Entropy and information in neural spike trains: Progress on the sampling problem
- 24 May 2004
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 69 (5), 056111
- https://doi.org/10.1103/physreve.69.056111
Abstract
The major problem in the information theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples. We apply a recently introduced Bayesian entropy estimator to synthetic data inspired by experiments, and to real experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information theoretic analysis of experiments, and may be of general interest as an example of learning from limited data.
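The abstract does not spell out the estimator's form, but the sampling problem it addresses is easy to illustrate: the naive "plug-in" entropy estimate, computed from empirical frequencies, is systematically biased downward when the number of samples is small relative to the number of possible responses. A minimal sketch in Python, showing the plug-in estimate alongside the classical first-order Miller–Madow bias correction (not the Bayesian estimator studied in the paper, just a simpler baseline for comparison):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Naive maximum-likelihood (plug-in) entropy estimate, in bits.

    Uses empirical frequencies c/n in place of the true probabilities;
    this estimator is biased low for small sample sizes n.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order bias correction (K-1)/(2N),
    in bits, where K is the number of observed distinct symbols."""
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))

# For a fair two-symbol source observed many times, both estimates
# are close to the true entropy of 1 bit; the gap between them grows
# as the sample size shrinks relative to the alphabet size.
data = ["a", "b"] * 50
print(plugin_entropy(data), miller_madow_entropy(data))
```

The correction term shrinks as 1/N, so it helps only at the edge of the undersampled regime; estimators like the one evaluated in this paper are designed for the much harder case where N is comparable to, or smaller than, the number of possible spike words.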