Sparse EEG/MEG source estimation via a group lasso
Date
12/06/2017
Abstract
Non-invasive recordings of human brain activity through electroencephalography (EEG) or magnetoencephalography (MEG) are of value for both basic science and clinical applications in sensory, cognitive, and affective neuroscience. Here we introduce a new approach to estimating the intra-cranial sources of EEG/MEG activity measured from extra-cranial sensors. The approach is based on the group lasso, a sparse-prior inverse that has been adapted to take advantage of functionally defined regions of interest for the definition of physiologically meaningful groups within a functionally based common space. Detailed simulations using realistic source geometries and data from a human Visual Evoked Potential experiment demonstrate that the group-lasso method has improved performance over traditional ℓ2 minimum-norm methods. In addition, we show that pooling source estimates across subjects over functionally defined regions of interest improves the accuracy of source estimates for both the group-lasso and minimum-norm approaches.
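To make the abstract's core idea concrete, a group-lasso inverse penalizes the ℓ2 norm of each source group (e.g. a functionally defined region of interest) rather than each source individually, driving entire groups to zero. Below is a minimal, hedged sketch of this type of estimator, solved by proximal gradient descent with block soft-thresholding. All names (`group_lasso_inverse`, `L`, `Y`, `groups`, `lam`) are illustrative assumptions, not the authors' implementation; the paper's actual method, grouping scheme, and solver may differ.

```python
import numpy as np

def group_lasso_inverse(L, Y, groups, lam=1.0, n_iter=200):
    """Illustrative group-lasso source estimate (not the paper's code).

    Minimizes  0.5 * ||Y - L @ X||_F^2 + lam * sum_g ||X[g]||_F
    where L is the (sensors x sources) lead-field matrix, Y the
    (sensors x time) data, and each g in `groups` indexes the sources
    belonging to one region of interest.
    """
    n_sensors, n_sources = L.shape
    X = np.zeros((n_sources, Y.shape[1]))
    # Step size 1/Lipschitz constant of the smooth term's gradient.
    step = 1.0 / np.linalg.norm(L, 2) ** 2
    for _ in range(n_iter):
        grad = L.T @ (L @ X - Y)       # gradient of the data-fit term
        Z = X - step * grad            # gradient step
        for g in groups:               # proximal step: block soft-threshold
            norm = np.linalg.norm(Z[g])
            scale = max(0.0, 1.0 - step * lam / norm) if norm > 0 else 0.0
            X[g] = scale * Z[g]        # whole group shrinks (or zeroes) together
    return X
```

Because the penalty acts on whole groups, regions carrying no signal are set exactly to zero, which is the sparsity property the abstract contrasts with the diffuse estimates of ℓ2 minimum-norm inverses.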
Citation
Lim, M, Ales, JM, Cottereau, BR, Hastie, T & Norcia, AM 2017, 'Sparse EEG/MEG source estimation via a group lasso', PLoS One, vol. 12, no. 6, e0176835. https://doi.org/10.1371/journal.pone.0176835
Publication
PLoS One
Status
Peer reviewed
ISSN
1932-6203
Type
Journal article
Rights
© 2017 Lim et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Description
This work was supported by EY018875, National Institutes of Health; EY015790, National Institutes of Health; DMS-1007719, National Science Foundation; and RO1-EB001988-15, National Institutes of Health.
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.