Item metadata
dc.contributor.author | Smith, Victoria Anne
dc.contributor.author | Yu, Jing
dc.contributor.author | Smulders, Tom V
dc.contributor.author | Hartemink, Alex J
dc.contributor.author | Jarvis, Erich D
dc.date.accessioned | 2010-12-02T11:23:48Z
dc.date.available | 2010-12-02T11:23:48Z
dc.date.issued | 2006-11
dc.identifier.citation | Smith, V A, Yu, J, Smulders, T V, Hartemink, A J & Jarvis, E D 2006, 'Computational inference of neural information flow networks', PLoS Computational Biology, vol. 2, no. 11, e161. https://doi.org/10.1371/journal.pcbi.0020161
dc.identifier.issn | 1553-734X
dc.identifier.other | PURE: 339627
dc.identifier.other | PURE UUID: 7061c4f6-1446-4a0d-b69e-cf55dd06ee83
dc.identifier.other | standrews_research_output: 14256
dc.identifier.other | Scopus: 33751407959
dc.identifier.other | ORCID: /0000-0002-0487-2469/work/32209215
dc.identifier.uri | https://hdl.handle.net/10023/1586
dc.description | This research was supported by a Packard Foundation grant and a US National Science Foundation (NSF) Waterman Award to EDJ, an NSF CAREER grant and an Alfred P. Sloan Fellowship to AJH, and a US National Institutes of Health R01 DC7996 grant from the National Institute on Deafness and Other Communication Disorders to support the collaboration between AJH and EDJ. The original collaboration between AJH and EDJ was supported by a Duke University Bioinformatics grant.
dc.description.abstract | Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.
dc.format.extent | 14
dc.language.iso | eng
dc.relation.ispartof | PLoS Computational Biology
dc.rights | © 2006 Smith et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
dc.subject | QH301 Biology
dc.subject.lcc | QH301
dc.title | Computational inference of neural information flow networks
dc.type | Journal article
dc.description.version | Publisher PDF
dc.contributor.institution | University of St Andrews. School of Biology
dc.identifier.doi | https://doi.org/10.1371/journal.pcbi.0020161
dc.description.status | Peer reviewed
dc.identifier.url | http://www.scopus.com/inward/record.url?scp=33751407959&partnerID=8YFLogxK
dc.identifier.url | http://compbiol.plosjournals.org/perlserv/?request=get-document&doi=10.1371%2Fjournal.pcbi.0020161