Optimization principles for the neural code
- 1 May 1996
- journal article
- Published by Taylor & Francis in Network: Computation in Neural Systems
- Vol. 7 (2), 325-331
- https://doi.org/10.1088/0954-898x/7/2/013
Abstract
Recent experiments show that the neural codes at work in a wide range of creatures share some common features. At first sight, these observations seem unrelated. However, we show that these features arise naturally in a linear filtered threshold crossing model when we set the threshold to maximize the transmitted information. This maximization process requires neural adaptation not only to the DC signal level, as in conventional light and dark adaptation, but also to the statistical structure of the signal and noise distributions. We also present a new approach for calculating the mutual information between a neuron's output spike train and any aspect of its input signal that does not require reconstruction of the input signal. This formulation is valid provided the correlations in the spike train are small, and we provide a procedure for checking this assumption. This paper is based on joint work (DeWeese M 1995 Optimization principles for the neural code, Dissertation, Princeton University). Preliminary results from the linear filtered threshold crossing model appeared in a previous proceedings (DeWeese M and Bialek W 1995 Information flow in sensory neurons, Nuovo Cimento D 17 733-8), and the conclusions we reached at that time have been reaffirmed by further analysis of the model.
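The abstract names the key ingredients of the model: a linear filter, a threshold-crossing nonlinearity, and a threshold set to maximize the information transmitted about the filtered stimulus. The following is a minimal numerical sketch of that idea, not the paper's actual calculation: the stimulus statistics, filter shape, noise level, discretization, and the per-bin plug-in information estimate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimulus: white Gaussian signal passed through an assumed exponential
# filter, then corrupted by additive Gaussian noise (parameters illustrative).
n = 200_000
stimulus = rng.normal(size=n)
kernel = np.exp(-np.arange(20) / 5.0)
filtered = np.convolve(stimulus, kernel, mode="same")
observed = filtered + 0.5 * rng.normal(size=n)

def threshold_crossings(x, theta):
    """Binary spike word: 1 wherever x crosses theta from below."""
    above = x > theta
    return (above[1:] & ~above[:-1]).astype(int)

def mutual_information_bits(spikes, signal, bins=16):
    """Plug-in estimate (in bits) of the information a single time bin's spike
    word carries about the filtered signal, treating bins as independent -- a
    shortcut that is only sensible when spike-train correlations are small,
    the regime the paper's formulation assumes."""
    signal = signal[1:]                               # align with crossing indicator
    edges = np.quantile(signal, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(signal, edges[1:-1]), 0, bins - 1)
    joint = np.bincount(spikes * bins + idx, minlength=2 * bins)
    joint = joint.reshape(2, bins).astype(float)
    joint /= joint.sum()
    p_s = joint.sum(axis=1, keepdims=True)
    p_x = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_x)[nz])))

# Sweep the threshold and keep the setting that maximizes transmitted information.
thetas = np.linspace(-2.0, 3.0, 41)
info = [mutual_information_bits(threshold_crossings(observed, t), filtered)
        for t in thetas]
print(f"information-maximizing threshold: {thetas[int(np.argmax(info))]:.2f}")
```

In this toy setting the optimal threshold shifts whenever the signal or noise statistics change, which is the sense in which the abstract's "adaptation to the statistical structure of the signal and noise distributions" goes beyond adaptation to the DC level alone.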