Item metadata

dc.contributor.author: Endres, Dominik Maria
dc.contributor.author: Foldiak, Peter
dc.date.accessioned: 2010-12-02T14:18:57Z
dc.date.available: 2010-12-02T14:18:57Z
dc.date.issued: 2005-11
dc.identifier.citation: Endres, D M & Foldiak, P 2005, 'Bayesian bin distribution inference and mutual information', IEEE Transactions on Information Theory, vol. 51, no. 11, pp. 3766-3779. https://doi.org/10.1109/TIT.2005.856954
dc.identifier.issn: 0018-9448
dc.identifier.other: PURE: 293246
dc.identifier.other: PURE UUID: 1188bdc6-6ab4-4e3c-b225-e09fd3c13354
dc.identifier.other: WOS: 000233047800006
dc.identifier.other: Scopus: 27744440784
dc.identifier.uri: https://hdl.handle.net/10023/1592
dc.description.abstract: We present an exact Bayesian treatment of a simple, yet sufficiently general probability distribution model. We consider piecewise-constant distributions P(X) with uniform (second-order) prior over the location of the discontinuity points and the assigned chances. The predictive distribution and the model complexity can be determined completely from the data in a computational time that is linear in the number of degrees of freedom and quadratic in the number of possible values of X. Furthermore, exact values of the expectations of entropies and their variances can be computed with polynomial effort. The expectation of the mutual information thus becomes available too, as does a strict upper bound on its variance. The resulting algorithm is particularly useful in experimental research areas where the number of available samples is severely limited (e.g., neurophysiology). Estimates on a simulated data set are more accurate than those obtained with a previously proposed method.
dc.format.extent: 14
dc.language.iso: eng
dc.relation.ispartof: IEEE Transactions on Information Theory
dc.rights: (c) 2005 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
dc.subject: Bayesian inference
dc.subject: Entropy
dc.subject: Model selection
dc.subject: Mutual information
dc.subject: QA Mathematics
dc.subject.lcc: QA
dc.title: Bayesian bin distribution inference and mutual information
dc.type: Journal article
dc.description.version: Publisher PDF
dc.contributor.institution: University of St Andrews. School of Psychology and Neuroscience
dc.identifier.doi: https://doi.org/10.1109/TIT.2005.856954
dc.description.status: Peer reviewed
dc.identifier.url: http://www.scopus.com/inward/record.url?scp=27744440784&partnerID=8YFLogxK
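
The abstract above describes an exact Bayesian treatment of piecewise-constant distributions P(X) with a uniform prior over the locations of the discontinuity points and the bin probabilities. Purely as orientation, the Python sketch below brute-forces that kind of model average on a tiny alphabet: it enumerates all contiguous binnings, weights each by its exact marginal likelihood under a flat Dirichlet prior, and averages the resulting predictive distributions. It is not the paper's algorithm, which reaches the stated linear/quadratic-time complexity via recursions and also delivers entropy and mutual-information expectations; the function names and the toy counts are illustrative assumptions.

    # Toy brute-force sketch of Bayesian binning (assumed names; not the paper's efficient algorithm)
    from itertools import combinations
    from math import comb, exp, lgamma, log

    def log_evidence(bin_counts, bin_sizes):
        """Exact log marginal likelihood of one contiguous binning, assuming a flat
        Dirichlet(1, ..., 1) prior on the bin probabilities and each bin's mass
        spread uniformly over the values it contains."""
        n, m = sum(bin_counts), len(bin_counts)
        le = lgamma(m) - lgamma(n + m)              # (m-1)! / (n+m-1)! from the Dirichlet integral
        for n_b, s_b in zip(bin_counts, bin_sizes):
            le += lgamma(n_b + 1) - n_b * log(s_b)  # n_b! / s_b**n_b from the within-bin uniform likelihood
        return le

    def predictive(counts, max_bins):
        """Model-averaged predictive P(X = j | data): average over the number of bins
        and over every placement of the bin boundaries (uniform priors on both)."""
        k, n = len(counts), sum(counts)
        log_weights, preds = [], []
        for m in range(1, max_bins + 1):
            for cuts in combinations(range(1, k), m - 1):     # boundary positions between values
                edges = (0,) + cuts + (k,)
                bins = [range(edges[i], edges[i + 1]) for i in range(m)]
                bin_counts = [sum(counts[j] for j in b) for b in bins]
                bin_sizes = [len(b) for b in bins]
                # posterior weight = evidence times uniform prior over cut placements, 1 / C(k-1, m-1)
                log_weights.append(log_evidence(bin_counts, bin_sizes) - log(comb(k - 1, m - 1)))
                p = [0.0] * k
                for b, n_b, s_b in zip(bins, bin_counts, bin_sizes):
                    for j in b:
                        p[j] = (n_b + 1) / ((n + m) * s_b)    # posterior-mean bin mass, spread over the bin
                preds.append(p)
        z = max(log_weights)
        weights = [exp(lw - z) for lw in log_weights]
        total = sum(weights)
        return [sum(w * p[j] for w, p in zip(weights, preds)) / total for j in range(k)]

    if __name__ == "__main__":
        counts = [0, 1, 4, 7, 5, 1, 0, 0]   # toy histogram of observed X values (an assumption)
        pred = predictive(counts, max_bins=4)
        print([round(p, 4) for p in pred], "sum =", round(sum(pred), 4))

Because the cost of this enumeration grows combinatorially in the number of possible values of X, it is only practical for very small alphabets; the paper's contribution is precisely that the same expectations can be obtained exactly in time linear in the degrees of freedom and quadratic in the alphabet size.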

