St Andrews Research Repository


Learning deep models from synthetic data for extracting dolphin whistle contours

File
PID6435539.pdf (3.083Mb)
Date
07/2020
Author
Li, Pu
Liu, Xiaobai
Palmer, Kaitlin
Fleishman, Erica
Gillespie, Douglas Michael
Nosal, Eva-Marie
Shiu, Yu
Klinck, Holger
Cholewiak, Danielle
Helble, Tyler
Roch, Marie
Keywords
Whistle contour extraction
Deep neural network
Data synthesis
Acoustic
Odontocetes
QA75 Electronic computers. Computer science
QH301 Biology
Software
Artificial Intelligence
3rd-DAS
Metadata
Show full item record
Abstract
We present a learning-based method for extracting whistles of toothed whales (Odontoceti) in hydrophone recordings. Our method represents audio signals as time-frequency spectrograms and decomposes each spectrogram into a set of time-frequency patches. A deep neural network learns archetypical patterns (e.g., crossings, frequency modulated sweeps) from the spectrogram patches and predicts time-frequency peaks that are associated with whistles. We also developed a comprehensive method to synthesize training samples from background environments and train the network with minimal human annotation effort. We applied the proposed learn-from-synthesis method to a subset of the public Detection, Classification, Localization, and Density Estimation (DCLDE) 2011 workshop data to extract whistle confidence maps, which we then processed with an existing contour extractor to produce whistle annotations. The F1-score of our best synthesis method was 0.158 greater than our baseline whistle extraction algorithm (~25% improvement) when applied to common dolphin (Delphinus spp.) and bottlenose dolphin (Tursiops truncatus) whistles.
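The abstract describes representing audio as a time-frequency spectrogram and decomposing it into patches that a network then classifies. The following is a minimal NumPy-only sketch of that front-end step; the window length, hop size, and patch dimensions are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier transform."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # rfft keeps only the non-negative frequency bins of the real-valued input.
    return np.abs(np.fft.rfft(frames, axis=1)).T  # shape: (freq_bins, n_frames)

def patches(spec, patch_size=64, stride=32):
    """Decompose a spectrogram into overlapping time-frequency patches."""
    f_bins, t_bins = spec.shape
    out = []
    for f in range(0, f_bins - patch_size + 1, stride):
        for t in range(0, t_bins - patch_size + 1, stride):
            out.append(spec[f:f + patch_size, t:t + patch_size])
    return np.stack(out)  # shape: (n_patches, patch_size, patch_size)
```

Each patch would then be fed to the network, which scores time-frequency peaks for whistle membership; the resulting confidence map is passed to a contour extractor, as the abstract outlines.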
Citation
Li, P., Liu, X., Palmer, K., Fleishman, E., Gillespie, D. M., Nosal, E.-M., Shiu, Y., Klinck, H., Cholewiak, D., Helble, T. & Roch, M. 2020, 'Learning deep models from synthetic data for extracting dolphin whistle contours', in 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings, 9206992, Proceedings of the International Joint Conference on Neural Networks, IEEE Computer Society, IEEE World Congress on Computational Intelligence (IEEE WCCI) - 2020 International Joint Conference on Neural Networks (IJCNN 2020), Glasgow, United Kingdom, 19/07/20. https://doi.org/10.1109/IJCNN48605.2020.9206992
Publication
2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
DOI
https://doi.org/10.1109/IJCNN48605.2020.9206992
Type
Conference item
Rights
Copyright © 2020 IEEE. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://ieeexplore.ieee.org
Collections
  • University of St Andrews Research
URI
http://hdl.handle.net/10023/19834

Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

