Item metadata

dc.contributor.author: Wijesinghe, Philip
dc.contributor.author: Corsetti, Stella
dc.contributor.author: Chow, Darren J. X.
dc.contributor.author: Sakata, Shuzo
dc.contributor.author: Dunning, Kylie R.
dc.contributor.author: Dholakia, Kishan
dc.date.accessioned: 2022-11-09T17:30:29Z
dc.date.available: 2022-11-09T17:30:29Z
dc.date.issued: 2022-11-02
dc.identifier: 281276818
dc.identifier: 3c495d47-d070-4562-aa58-d5d176d5bc69
dc.identifier: 85141058450
dc.identifier: 000877710800001
dc.identifier.citation: Wijesinghe, P, Corsetti, S, Chow, D J X, Sakata, S, Dunning, K R & Dholakia, K 2022, 'Experimentally unsupervised deconvolution for light-sheet microscopy with propagation-invariant beams', Light: Science & Applications, vol. 11, 319. https://doi.org/10.1038/s41377-022-00975-6
dc.identifier.issn: 2047-7538
dc.identifier.other: ORCID: /0000-0002-8378-7261/work/122719886
dc.identifier.uri: https://hdl.handle.net/10023/26350
dc.description: Funding: This project was funded by the UK Engineering and Physical Sciences Research Council (grants EP/P030017/1 and EP/R004854/1), and has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement (EC-GA 871212) and the H2020 FETOPEN project "Dynamic" (EC-GA 863203). P.W. was supported by the 1851 Research Fellowship from the Royal Commission. K.R.D. was supported by a Mid-Career Fellowship from the Hospital Research Foundation (C-MCF-58-2019). K.D. acknowledges support from the Australian Research Council through a Laureate Fellowship. S.S. was funded by BBSRC (BB/M00905X/1).
dc.description.abstract: Deconvolution is a challenging inverse problem, particularly in techniques that employ complex engineered point-spread functions, such as microscopy with propagation-invariant beams. Here, we present a deep-learning method for deconvolution that, in lieu of end-to-end training with ground truths, is trained using known physics of the imaging system. Specifically, we train a generative adversarial network with images generated with the known point-spread function of the system, and combine this with unpaired experimental data that preserve perceptual content. Our method rapidly and robustly deconvolves and super-resolves microscopy images, demonstrating a two-fold improvement in image contrast over conventional deconvolution methods. In contrast to common end-to-end networks that often require 1,000s–10,000s of paired images, our method is experimentally unsupervised and can be trained solely on a few hundred regions of interest. We demonstrate its performance on light-sheet microscopy (LSM) with propagation-invariant Airy beams in oocytes, preimplantation embryos and excised brain tissue, and illustrate its utility for Bessel-beam LSM. This method aims to democratise learned methods for deconvolution, as it does not require data acquisition outwith the conventional imaging protocol.
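The abstract states that the network is trained on images generated with the known point-spread function (PSF) of the system rather than on measured ground truths. As a hedged illustration of that forward model (not the authors' code), the sketch below builds a hypothetical bead phantom and convolves it with a placeholder Gaussian PSF; the paper would use the Airy or Bessel light-sheet PSF in its place. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def synth_training_pair(shape=(64, 64), n_beads=20, sigma=2.0, seed=0):
    """Generate one (sharp, blurred) pair from a known imaging model.

    Illustrative sketch only: a random bead phantom stands in for real
    structure, and a normalised Gaussian stands in for the system PSF
    (the paper's method would use the Airy-beam light-sheet PSF).
    """
    rng = np.random.default_rng(seed)

    # Sparse bead phantom as the notional "sharp" object.
    sharp = np.zeros(shape)
    ys = rng.integers(0, shape[0], n_beads)
    xs = rng.integers(0, shape[1], n_beads)
    sharp[ys, xs] = rng.uniform(0.5, 1.0, n_beads)

    # Placeholder Gaussian PSF, normalised to unit sum.
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    psf /= psf.sum()

    # FFT-based circular convolution applies the forward imaging model;
    # ifftshift moves the PSF centre to the origin so the image is not shifted.
    blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) *
                                   np.fft.fft2(np.fft.ifftshift(psf))))
    blurred += rng.normal(0, 0.01, shape)  # additive detector noise
    return sharp, blurred

sharp, blurred = synth_training_pair()
```

Pairs like this, together with unpaired experimental images, are what the abstract describes feeding to the adversarial training, so no experimentally acquired ground truth is needed.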
dc.format.extent: 15
dc.format.extent: 4608806
dc.language.iso: eng
dc.relation.ispartof: Light: Science & Applications
dc.subject: QA75 Electronic computers. Computer science
dc.subject: QC Physics
dc.subject: DAS
dc.subject: MCC
dc.subject.lcc: QA75
dc.subject.lcc: QC
dc.title: Experimentally unsupervised deconvolution for light-sheet microscopy with propagation-invariant beams
dc.type: Journal article
dc.contributor.sponsor: EPSRC
dc.contributor.sponsor: EPSRC
dc.contributor.sponsor: European Commission
dc.contributor.sponsor: European Commission
dc.contributor.institution: University of St Andrews. School of Physics and Astronomy
dc.contributor.institution: University of St Andrews. Sir James Mackenzie Institute for Early Diagnosis
dc.contributor.institution: University of St Andrews. Centre for Biophotonics
dc.contributor.institution: University of St Andrews. Institute of Behavioural and Neural Sciences
dc.contributor.institution: University of St Andrews. Biomedical Sciences Research Complex
dc.identifier.doi: 10.1038/s41377-022-00975-6
dc.description.status: Peer reviewed
dc.identifier.grantnumber: EP/P030017/1
dc.identifier.grantnumber: EP/R004854/1
dc.identifier.grantnumber: 871212
dc.identifier.grantnumber: 863203

