From distance sampling to spatial capture-recapture
Distance sampling and capture–recapture are the two most widely used wildlife abundance estimation methods. Capture–recapture methods have only recently incorporated models for spatial distribution, and there is an increasing tendency for distance sampling methods to incorporate spatial models rather than rely on partly design-based spatial inference. In this overview we show how spatial models are central to modern distance sampling and that spatial capture–recapture models arise as an extension of distance sampling methods. Depending on the type of data recorded, they can be viewed as particular kinds of hierarchical binary regression, Poisson regression, survival or time-to-event models, with individuals’ locations as latent variables and a spatial model as the latent variable distribution. Incorporation of spatial models in these two methods provides new opportunities for drawing explicitly spatial inferences. Areas of likely future development include more sophisticated spatial and spatio-temporal modelling of individuals’ locations and movements, new methods for integrating spatial capture–recapture and other kinds of ecological survey data, and methods for dealing with the recapture uncertainty that often arises when “capture” consists of detection by a remote device such as a camera trap or microphone.
Borchers , D L & Marques , T A 2017 , ' From distance sampling to spatial capture-recapture ' , AStA Advances in Statistical Analysis , vol. 101 , no. 4 , pp. 475-494 . https://doi.org/10.1007/s10182-016-0287-7
AStA Advances in Statistical Analysis
© The Author(s) 2017. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Description: TAM acknowledges support from CEAUL (funded by FCT—Fundação para a Ciência e a Tecnologia, Portugal, through the Project UID/MAT/00006/2013).
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.