Bayesian bin distribution inference and mutual information
Abstract
We present an exact Bayesian treatment of a simple, yet sufficiently general probability distribution model. We consider piecewise-constant distributions P(X) with uniform (second-order) prior over the location of discontinuity points and the assigned chances. The predictive distribution and the model complexity can be determined completely from the data in a computational time that is linear in the number of degrees of freedom and quadratic in the number of possible values of X. Furthermore, exact values of the expectations of entropies and their variances can be computed with polynomial effort. The expectation of the mutual information thus becomes available as well, along with a strict upper bound on its variance. The resulting algorithm is particularly useful in experimental research areas where the number of available samples is severely limited (e.g., neurophysiology). Estimates on a simulated data set are more accurate than those obtained with a previously proposed method.
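The bin model summarised above can be conveyed with a short, hedged sketch. The Python code below is not the authors' algorithm: it brute-forces Bayesian model averaging over all bin-boundary placements for a small ordered alphabet, assuming a flat prior over binnings and a uniform Dirichlet prior on the bin masses, whereas the paper derives exact recursions whose cost is linear in the degrees of freedom and quadratic in the number of possible values of X. All names and the example counts are hypothetical.

# Illustrative sketch only (not the paper's algorithm): brute-force Bayesian
# model averaging over all bin-boundary placements for a small ordered
# alphabet, assuming a flat prior over binnings and a uniform Dirichlet prior
# on the bin masses. The paper's exact recursions avoid this enumeration.
from itertools import combinations
from math import lgamma, log, exp

def log_evidence(counts, boundaries):
    # Log marginal likelihood of the observed counts under one binning:
    # values inside a bin share the bin's mass equally, and the bin masses
    # (pi_1, ..., pi_M) are integrated out under a Dirichlet(1, ..., 1) prior.
    edges = [0] + list(boundaries) + [len(counts)]
    n_m = [sum(counts[a:b]) for a, b in zip(edges, edges[1:])]  # counts per bin
    w_m = [b - a for a, b in zip(edges, edges[1:])]             # bin widths
    M, N = len(n_m), sum(n_m)
    log_ev = lgamma(M) - lgamma(N + M)          # Dirichlet-multinomial normaliser
    for n, w in zip(n_m, w_m):
        log_ev += lgamma(n + 1) - n * log(w)    # uniform spreading within a bin
    return log_ev

def predictive(counts):
    # Posterior-averaged predictive distribution over the K ordered values,
    # weighting every binning by its evidence.
    K, N = len(counts), sum(counts)
    total, pred = 0.0, [0.0] * K
    for M in range(1, K + 1):                             # number of bins
        for boundaries in combinations(range(1, K), M - 1):
            weight = exp(log_evidence(counts, boundaries))
            total += weight
            edges = [0] + list(boundaries) + [K]
            for a, b in zip(edges, edges[1:]):
                n, w = sum(counts[a:b]), b - a
                # Posterior mean bin mass (n + 1)/(N + M), spread evenly
                # over the w values inside the bin.
                for j in range(a, b):
                    pred[j] += weight * (n + 1) / ((N + M) * w)
    return [p / total for p in pred]

if __name__ == "__main__":
    counts = [7, 5, 6, 0, 1, 0, 0, 2]   # hypothetical sample counts, K = 8
    print(predictive(counts))

Running the sketch prints a smoothed predictive distribution in which neighbouring values with similar counts are effectively pooled, which reflects the bin-averaging behaviour of the model; the enumeration over binnings is exponential in the alphabet size and is used here purely for illustration.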
Citation
Endres, D M & Foldiak, P 2005, 'Bayesian bin distribution inference and mutual information', IEEE Transactions on Information Theory, vol. 51, no. 11, pp. 3766-3779. https://doi.org/10.1109/TIT.2005.856954
Publication
IEEE Transactions on Information Theory
Status
Peer reviewed
ISSN
0018-9448
Type
Journal article
Rights
(c) 2005 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.