Item metadata

dc.contributor.author: Love, Teri
dc.contributor.author: Neukirch, Thomas
dc.contributor.author: Parnell, Clare E.
dc.date.accessioned: 2020-06-26T08:30:01Z
dc.date.available: 2020-06-26T08:30:01Z
dc.date.issued: 2020-06-26
dc.identifier.citation: Love, T., Neukirch, T. & Parnell, C. E. 2020, 'Analyzing AIA flare observations using convolutional neural networks', Frontiers in Astronomy and Space Sciences, vol. 7, 34. https://doi.org/10.3389/fspas.2020.00034
dc.identifier.issn: 2296-987X
dc.identifier.other: PURE: 268157496
dc.identifier.other: PURE UUID: 2f81c043-918a-4791-b052-cb660ea3a3a4
dc.identifier.other: ORCID: /0000-0002-5694-9069/work/76386632
dc.identifier.other: ORCID: /0000-0002-7597-4980/work/76386774
dc.identifier.other: WOS: 000551790700001
dc.identifier.other: Scopus: 85115706284
dc.identifier.uri: https://hdl.handle.net/10023/20158
dc.description: TL acknowledges support by the UK's Science and Technology Facilities Council (STFC) Doctoral Training Centre Grant ST/P006809/1 (ScotDIST). TN and CP both acknowledge support by the STFC Consolidated Grant ST/S000402/1.
dc.description.abstract: In order to efficiently analyse the vast amount of data generated by solar space missions and ground-based instruments, modern machine learning techniques such as decision trees, support vector machines (SVMs) and neural networks can be very useful. In this paper we present initial results from using a convolutional neural network (CNN) to analyse observations from the Atmospheric Imaging Assembly (AIA) in the 1600 Å wavelength. The data are pre-processed to locate flaring regions where flare ribbons are visible in the observations. The CNN is created and trained to automatically analyse the shape and position of the flare ribbons by identifying whether each image belongs to one of four classes: two-ribbon flare, compact/circular-ribbon flare, limb flare, or quiet Sun, with the final class acting as a control for any data included in the training or test sets where flaring regions are not present. The network created can classify flare ribbon observations into any of the four classes with a final accuracy of 94%. Initial results show that most of the images are correctly classified, with the compact flare class being the only class where accuracy drops below 90%, as some observations are wrongly classified as belonging to the limb class.
dc.format.extent: 8
dc.language.iso: eng
dc.relation.ispartof: Frontiers in Astronomy and Space Sciences
dc.rights: Copyright © 2020 Love, Neukirch and Parnell. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
dc.subject: Convolutional neural network
dc.subject: Solar flares
dc.subject: Flare ribbons
dc.subject: Machine learning
dc.subject: Classification
dc.subject: Helio19
dc.subject: QB Astronomy
dc.subject: DAS
dc.subject.lcc: QB
dc.title: Analyzing AIA flare observations using convolutional neural networks
dc.type: Journal article
dc.contributor.sponsor: Science & Technology Facilities Council
dc.description.version: Publisher PDF
dc.contributor.institution: University of St Andrews. Applied Mathematics
dc.identifier.doi: https://doi.org/10.3389/fspas.2020.00034
dc.description.status: Peer reviewed
dc.identifier.grantnumber: ST/S000402/1
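The abstract above describes a four-class CNN classifier for pre-processed AIA 1600 Å flare-ribbon images. Below is a minimal sketch of that kind of classifier in Python/Keras; the 128×128 input size, layer widths, and training settings are illustrative assumptions and are not the authors' architecture, and the random arrays merely stand in for labelled AIA cutouts.

```python
# Minimal sketch of a 4-class CNN for flare-ribbon image cutouts.
# Assumes pre-processed AIA 1600 A images resized to 128x128 pixels;
# all layer sizes and training settings are illustrative, not taken
# from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

CLASSES = ["two-ribbon", "compact/circular", "limb", "quiet-Sun"]

def build_model(input_shape=(128, 128, 1), n_classes=len(CLASSES)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random placeholder data standing in for labelled AIA cutouts.
    x_train = np.random.rand(32, 128, 128, 1).astype("float32")
    y_train = np.random.randint(0, len(CLASSES), size=32)
    model = build_model()
    model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)
    print(model.predict(x_train[:1]).round(3))  # class probabilities
```

With real data, the integer labels would correspond to the four classes listed in the abstract, and the quiet-Sun class would act as the control for non-flaring regions.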

