Files in this item
A Double machine learning trend model for citizen science data
Item metadata
dc.contributor.author | Fink, Daniel | |
dc.contributor.author | Johnston, Alison | |
dc.contributor.author | Strimas-Mackey, Matt | |
dc.contributor.author | Auer, Tom | |
dc.contributor.author | Hochachka, Wesley M. | |
dc.contributor.author | Ligocki, Shawn | |
dc.contributor.author | Oldham Jaromczyk, Lauren | |
dc.contributor.author | Robinson, Orin | |
dc.contributor.author | Wood, Chris | |
dc.contributor.author | Kelling, Steve | |
dc.contributor.author | Rodewald, Amanda D. | |
dc.date.accessioned | 2023-07-24T12:30:05Z | |
dc.date.available | 2023-07-24T12:30:05Z | |
dc.date.issued | 2023-09-01 | |
dc.identifier.citation | Fink, D., Johnston, A., Strimas-Mackey, M., Auer, T., Hochachka, W. M., Ligocki, S., Oldham Jaromczyk, L., Robinson, O., Wood, C., Kelling, S. & Rodewald, A. D. 2023, 'A Double machine learning trend model for citizen science data', Methods in Ecology and Evolution, vol. 14, no. 9, pp. 2435-2448. https://doi.org/10.1111/2041-210X.14186 | en |
dc.identifier.issn | 2041-210X | |
dc.identifier.other | PURE: 291061643 | |
dc.identifier.other | PURE UUID: 5f6613f1-a3d7-4dee-ae21-22cdb0becc71 | |
dc.identifier.other | RIS: urn:9A184EB5B54C18AB9A0649AABBC81E58 | |
dc.identifier.other | ORCID: /0000-0001-8221-013X/work/139554355 | |
dc.identifier.other | Scopus: 85165394908 | |
dc.identifier.uri | http://hdl.handle.net/10023/28008 | |
dc.description | Funding: This work was funded by The Leon Levy Foundation, The Wolf Creek Foundation and the National Science Foundation (ABI sustaining: DBI-1939187). This work used Bridges2 at Pittsburgh Supercomputing Center and Anvil at Rosen Center for Advanced Computing at Purdue University through allocation DEB200010 from the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603 and #2138296. Our research was also funded through the 2017–2018 Belmont Forum and BiodivERsA joint call for research proposals, under the BiodivScen ERA-Net COFUND program, with financial support from the Academy of Finland (AKA, Univ. Turku: 326327, Univ. Helsinki: 326338), the Swedish Research Council (Formas, SLU: 2018-02440, Lund Univ.: 2018-02441), the Research Council of Norway (Forskningsrådet, NINA: 295767) and the U.S. National Science Foundation (NSF, Cornell Univ.: ICER-1927646). | en |
dc.description.abstract | 1. Citizen and community science datasets are typically collected using flexible protocols. These protocols enable large volumes of data to be collected globally every year; however, the consequence is that these protocols typically lack the structure necessary to maintain consistent sampling across years. This can result in complex and pronounced interannual changes in the observation process, which can complicate the estimation of population trends because population changes over time are confounded with changes in the observation process. 2. Here we describe a novel modelling approach designed to estimate spatially explicit species population trends while controlling for the interannual confounding common in citizen science data. The approach is based on Double machine learning, a statistical framework that uses machine learning (ML) methods to estimate population change and the propensity scores used to adjust for confounding discovered in the data. ML makes it possible to use large sets of features to control for confounding and to model spatial heterogeneity in trends. Additionally, we present a simulation method to identify and adjust for residual confounding missed by the propensity scores. 3. To illustrate the approach, we estimated species trends using data from the citizen science project eBird. We used a simulation study to assess the ability of the method to estimate spatially varying trends when faced with realistic confounding and temporal correlation. Results demonstrated the ability to distinguish between spatially constant and spatially varying trends. There were low error rates on the estimated direction of population change (increasing/decreasing) at each location and high correlations on the estimated magnitude of population change. 4. The ability to estimate spatially explicit trends while accounting for confounding inherent in citizen science data has the potential to fill important information gaps, helping to estimate population trends for species and/or regions lacking rigorous monitoring data. | |
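The Double machine learning idea summarized in the abstract can be illustrated with a minimal partialling-out sketch: residualize both the outcome and the "treatment" (here, year) on confounders using flexible ML learners with cross-fitting, then regress the residuals on each other to estimate the trend. This is a generic textbook-style DML example with simulated data, not the paper's actual spatially varying causal-forest model; all variable names and the data-generating process are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

# Hypothetical simulated data (not from the paper): outcome y (e.g. a count
# index), treatment t (e.g. year), confounders X (e.g. observation-effort
# covariates that drift over time and so confound the trend).
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))
t = 0.5 * X[:, 0] + rng.normal(size=n)                       # year correlated with effort
y = 0.3 * t + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)   # true trend = 0.3

# Partialling-out with cross-fitting: predict y and t from X out-of-fold,
# so the nuisance fits do not overfit the same observations they adjust.
y_res = y - cross_val_predict(RandomForestRegressor(n_estimators=100, random_state=0), X, y, cv=5)
t_res = t - cross_val_predict(RandomForestRegressor(n_estimators=100, random_state=0), X, t, cv=5)

# Final-stage regression of residuals on residuals gives the DML estimate.
theta = (t_res @ y_res) / (t_res @ t_res)
print(theta)  # should land near the true trend of 0.3
```

The cross-fitting step (here via `cross_val_predict`) is what lets flexible ML nuisance models be used without their regularization bias leaking into the trend estimate, which is the core of the Double machine learning framework the abstract invokes.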
dc.format.extent | 14 | |
dc.language.iso | eng | |
dc.relation.ispartof | Methods in Ecology and Evolution | en |
dc.rights | Copyright © 2023 The Authors. Methods in Ecology and Evolution published by John Wiley & Sons Ltd on behalf of British Ecological Society. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes. | en |
dc.subject | Causal Forests | en |
dc.subject | Causal inference | en |
dc.subject | Citizen science | en |
dc.subject | Confounding | en |
dc.subject | Double machine learning | en |
dc.subject | Machine learning | en |
dc.subject | Propensity score | en |
dc.subject | Trends | en |
dc.subject | Ecology, Evolution, Behavior and Systematics | en |
dc.subject | Ecological Modelling | en |
dc.subject | DAS | en |
dc.subject | MCC | en |
dc.title | A Double machine learning trend model for citizen science data | en |
dc.type | Journal article | en |
dc.description.version | Publisher PDF | en |
dc.contributor.institution | University of St Andrews. Statistics | en |
dc.contributor.institution | University of St Andrews. Centre for Research into Ecological & Environmental Modelling | en |
dc.identifier.doi | https://doi.org/10.1111/2041-210X.14186 | |
dc.description.status | Peer reviewed | en |
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.