Removing human bottlenecks in bird classification using camera trap images and deep learning
Item metadata
dc.contributor.author | Chalmers, Carl | |
dc.contributor.author | Fergus, Paul | |
dc.contributor.author | Wich, Serge | |
dc.contributor.author | Longmore, Steven N. | |
dc.contributor.author | Walsh, Naomi Davies | |
dc.contributor.author | Stephens, Philip A. | |
dc.contributor.author | Sutherland, Chris | |
dc.contributor.author | Matthews, Naomi | |
dc.contributor.author | Mudde, Jens | |
dc.contributor.author | Nuseibeh, Amira | |
dc.date.accessioned | 2023-05-31T15:30:12Z | |
dc.date.available | 2023-05-31T15:30:12Z | |
dc.date.issued | 2023-05-18 | |
dc.identifier | 286942493 | |
dc.identifier | dd68042b-924d-467a-93d5-8ba7427505d0 | |
dc.identifier | 85160612596 | |
dc.identifier.citation | Chalmers, C., Fergus, P., Wich, S., Longmore, S. N., Walsh, N. D., Stephens, P. A., Sutherland, C., Matthews, N., Mudde, J. & Nuseibeh, A. 2023, 'Removing human bottlenecks in bird classification using camera trap images and deep learning', Remote Sensing, vol. 15, no. 10, 2638. https://doi.org/10.3390/rs15102638 | en |
dc.identifier.issn | 2072-4292 | |
dc.identifier.other | Jisc: 1108386 | |
dc.identifier.other | ORCID: /0000-0003-2073-1751/work/136289066 | |
dc.identifier.uri | https://hdl.handle.net/10023/27714 | |
dc.description.abstract | Birds are important indicators for monitoring both biodiversity and habitat health; they also play a crucial role in ecosystem management. Declines in bird populations can result in reduced ecosystem services, including seed dispersal, pollination and pest control. Accurate and long-term monitoring of birds to identify species of concern while measuring the success of conservation interventions is essential for ecologists. However, monitoring is time-consuming, costly and often difficult to manage over long durations and at meaningfully large spatial scales. Technology such as camera traps, acoustic monitors and drones provides methods for non-invasive monitoring. There are two main problems with using camera traps for monitoring: (a) cameras generate many images, making it difficult to process and analyse the data in a timely manner; and (b) the high proportion of false positives hinders the processing and analysis for reporting. In this paper, we outline an approach for overcoming these issues by utilising deep learning for real-time classification of bird species and automated removal of false positives in camera trap data. Images are classified in real-time using a Faster-RCNN architecture. Images are transmitted from 3/4G-enabled cameras and processed using Graphics Processing Units (GPUs) to provide conservationists with key detection metrics, thereby removing the requirement for manual observations. Our models achieved an average sensitivity of 88.79%, a specificity of 98.16% and an accuracy of 96.71%. This demonstrates the effectiveness of using deep learning for automatic bird monitoring. | |
dc.format.extent | 22 | |
dc.format.extent | 14092119 | |
dc.language.iso | eng | |
dc.relation.ispartof | Remote Sensing | en |
dc.subject | Conservation | en |
dc.subject | Object detection | en |
dc.subject | Image processing | en |
dc.subject | Modelling biodiversity | en |
dc.subject | Deep learning | en |
dc.subject | E-DAS | en |
dc.subject | MCC | en |
dc.title | Removing human bottlenecks in bird classification using camera trap images and deep learning | en |
dc.type | Journal article | en |
dc.contributor.institution | University of St Andrews. Statistics | en |
dc.contributor.institution | University of St Andrews. Centre for Research into Ecological & Environmental Modelling | en |
dc.identifier.doi | 10.3390/rs15102638 | |
dc.description.status | Peer reviewed | en |
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.