Classification of drones and birds using convolutional neural networks applied to radar micro-Doppler spectrogram images
Item metadata
dc.contributor.author | Rahman, Samiur | |
dc.contributor.author | Robertson, Duncan Alexander | |
dc.date.accessioned | 2020-01-27T17:30:02Z | |
dc.date.available | 2020-01-27T17:30:02Z | |
dc.date.issued | 2020-03-26 | |
dc.identifier.citation | Rahman , S & Robertson , D A 2020 , ' Classification of drones and birds using convolutional neural networks applied to radar micro-Doppler spectrogram images ' , IET Radar Sonar and Navigation , vol. 14 , no. 5 , pp. 653-661 . https://doi.org/10.1049/iet-rsn.2019.0493 | en |
dc.identifier.issn | 1751-8784 | |
dc.identifier.other | PURE: 266010549 | |
dc.identifier.other | PURE UUID: deb7a8c2-7f30-4ce2-a6e4-e78d27fbf120 | |
dc.identifier.other | ORCID: /0000-0002-4042-2772/work/68281192 | |
dc.identifier.other | ORCID: /0000-0002-5477-4218/work/68281762 | |
dc.identifier.other | Scopus: 85082518259 | |
dc.identifier.other | WOS: 000526412400001 | |
dc.identifier.uri | http://hdl.handle.net/10023/19360 | |
dc.description | Funding: UK Science and Technology Facilities Council ST/N006569/1 (DR). | en |
dc.description.abstract | This study presents a convolutional neural network (CNN) based drone classification method. The primary requirement for high-fidelity neural network based classification is a real dataset of large size and diversity for training. The first goal of the study was to create a large database of micro-Doppler spectrogram images of in-flight drones and birds. Two separate datasets containing the same images have been created, one with RGB images and the other with grayscale images. The RGB dataset was used for GoogLeNet architecture-based training. The grayscale dataset was used for training with a series architecture developed during this study. Each dataset was further divided into two categories, one with four classes (drone, bird, clutter and noise) and the other with two classes (drone and non-drone). During training, 20% of the dataset was used as a validation set. After the completion of training, the models were tested with previously unseen and unlabelled sets of data. The validation and testing accuracies for the developed series network were found to be 99.6% and 94.4% respectively for four classes, and 99.3% and 98.3% respectively for two classes. The GoogLeNet based model showed both validation and testing accuracies of around 99% for all the cases. | |
dc.format.extent | 9 | |
dc.language.iso | eng | |
dc.relation.ispartof | IET Radar Sonar and Navigation | en |
dc.rights | Copyright © 2019 Institution of Engineering and Technology. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1049/iet-rsn.2019.0493 | en |
dc.subject | Neural network | en |
dc.subject | FMCW Doppler | en |
dc.subject | Radar | en |
dc.subject | Target classification | en |
dc.subject | CNN | en |
dc.subject | Drone | en |
dc.subject | Bird | en |
dc.subject | QA75 Electronic computers. Computer science | en |
dc.subject | QC Physics | en |
dc.subject | T Technology | en |
dc.subject | NDAS | en |
dc.subject.lcc | QA75 | en |
dc.subject.lcc | QC | en |
dc.subject.lcc | T | en |
dc.title | Classification of drones and birds using convolutional neural networks applied to radar micro-Doppler spectrogram images | en |
dc.type | Journal article | en |
dc.contributor.sponsor | Science & Technology Facilities Council | en |
dc.description.version | Postprint | en |
dc.contributor.institution | University of St Andrews. School of Physics and Astronomy | en |
dc.identifier.doi | https://doi.org/10.1049/iet-rsn.2019.0493 | |
dc.description.status | Peer reviewed | en |
dc.identifier.grantnumber | ST/N006569/1 | en |