St Andrews Research Repository


Automated detection of Hainan gibbon calls for passive acoustic monitoring

Dufourq_2021_RSEC_automated_detection_CC.pdf (3.515Mb)
Date
08/04/2021
Author
Dufourq, Emmanuel
Durbach, Ian
Hansford, James P.
Hoepfner, Amanda
Ma, Heidi
Bryant, Jessica V.
Stender, Christina S.
Li, Wenyong
Liu, Zhiwei
Chen, Qing
Zhou, Zhaoli
Turvey, Samuel T.
Keywords
Bioacoustics
Convolutional neural networks
Deep learning
Hainan gibbons
Passive acoustic monitoring
Species identification
QA75 Electronic computers. Computer science
QH301 Biology
DAS
Abstract
Extracting species calls from passive acoustic recordings is a common preliminary step to ecological analysis. For many species, particularly those occupying noisy, acoustically variable habitats, the call extraction process continues to be largely manual, a time-consuming and increasingly unsustainable process. Deep neural networks have been shown to offer excellent performance across a range of acoustic classification applications, but are relatively underused in ecology. We describe the steps involved in developing an automated classifier for a passive acoustic monitoring project, using the identification of calls of the Hainan gibbon Nomascus hainanus, one of the world's rarest mammal species, as a case study. This includes preprocessing (selecting a temporal resolution, windowing and annotation); data augmentation; processing (choosing and fitting appropriate neural network models); and post-processing (linking model predictions to replace, or more likely facilitate, manual labelling). Our best model converted acoustic recordings into spectrogram images on the mel frequency scale, using these to train a convolutional neural network. Model predictions were highly accurate, with per-second false positive and false negative rates of 1.5% and 22.3%. Nearly all false negatives were at the fringes of calls, adjacent to segments where the call was correctly identified, so that very few calls were missed altogether. A post-processing step identifying intervals of repeated calling reduced an 8-h recording to, on average, 22 min for manual processing, and did not miss any calling bouts over 72 h of test recordings. Gibbon calling bouts were detected regularly in multi-month recordings from all selected survey points within Bawangling National Nature Reserve, Hainan. We demonstrate that passive acoustic monitoring incorporating an automated classifier represents an effective tool for remote detection of one of the world's rarest and most threatened species.
Our study highlights the viability of using neural networks to automate or greatly assist the manual labelling of data collected by passive acoustic monitoring projects. We emphasize that model development and implementation be informed and guided by ecological objectives, and increase accessibility of these tools with a series of notebooks that allow users to build and deploy their own acoustic classifiers.
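The post-processing step described in the abstract (grouping per-second model detections into intervals of repeated calling, so only those intervals need manual review) can be sketched as below. This is a minimal illustration of the idea, not the authors' code: the function name, the `min_hits` and `max_gap` thresholds, and the 0/1 input format are all assumptions made for the example.

```python
def detect_bouts(per_second_preds, min_hits=3, max_gap=10):
    """Group positive seconds into calling bouts.

    per_second_preds : iterable of 0/1 model outputs, one per second.
    min_hits : minimum positive seconds for an interval to count as a bout.
    max_gap  : bridge silent gaps up to this many seconds inside a bout.
    Returns a list of (start_sec, end_sec) intervals, end exclusive.
    """
    bouts = []
    start = None   # start of the bout currently being built, if any
    hits = 0       # positive seconds accumulated in the current bout
    gap = 0        # consecutive silent seconds since the last positive
    end = 0        # one past the last positive second seen in the bout
    for t, p in enumerate(per_second_preds):
        if p:
            if start is None:
                start, hits, gap = t, 0, 0
            hits += 1
            gap = 0
            end = t + 1
        elif start is not None:
            gap += 1
            if gap > max_gap:
                # Silence too long: close the current bout if it is big enough.
                if hits >= min_hits:
                    bouts.append((start, end))
                start = None
    if start is not None and hits >= min_hits:
        bouts.append((start, end))
    return bouts
```

Summing `end - start` over the returned intervals of an 8-h (28 800 s) recording gives the audio a human still has to check, which is the kind of reduction the abstract quantifies (to roughly 22 min on average).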
Citation
Dufourq, E., Durbach, I., Hansford, J. P., Hoepfner, A., Ma, H., Bryant, J. V., Stender, C. S., Li, W., Liu, Z., Chen, Q., Zhou, Z. & Turvey, S. T. 2021, 'Automated detection of Hainan gibbon calls for passive acoustic monitoring', Remote Sensing in Ecology and Conservation, vol. Early View. https://doi.org/10.1002/rse2.201
Publication
Remote Sensing in Ecology and Conservation
Status
Peer reviewed
DOI
https://doi.org/10.1002/rse2.201
ISSN
2056-3485
Type
Journal article
Rights
Copyright © 2021 The Authors. Remote Sensing in Ecology and Conservation published by John Wiley & Sons Ltd on behalf of Zoological Society of London. This is an open access article under the terms of the Creative Commons Attribution‐NonCommercial License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.
Description
Fieldwork was funded by an Arcus Foundation grant to STT and a Wildlife Acoustics grant to JVB. ID is supported in part by funding from the National Research Foundation of South Africa (Grant ID 90782, 105782). ED is supported by a postdoctoral fellowship from the African Institute for Mathematical Sciences South Africa, Stellenbosch University and the Next Einstein Initiative. This work was carried out with the aid of a grant from the International Development Research Centre, Ottawa, Canada (www.idrc.ca), and with financial support from the Government of Canada, provided through Global Affairs Canada (GAC; www.international.gc.ca).
Collections
  • University of St Andrews Research
URI
http://hdl.handle.net/10023/23004

Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
