Entropy as an Index of the Informational State of Neurons

Abstract
Techniques were developed for using the classical information-theory descriptor, entropy, to quantify the “uncertainty” present in neuronal spike trains. Entropy was calculated by a method that describes the relationships among serially ordered interspike intervals, encoding the intervals as a series of symbols, each of which depicts the relative duration of two adjacent intervals. Each symbol, or set of symbols, has a specific fractional entropy value derived from its probability of occurrence; moreover, fractional entropy can describe the relative amount of “information” associated with the location of a given symbol within a string of symbols. Using spike trains from 12 single neurons in the cerebellar cortex of rats, we determined: (1) the mean and S.D. of the information content of each symbol at each position in a group of symbols (2-4 symbols/group, based on 3-5 adjacent intervals), (2) the 4-symbol groups with the least and the greatest average fractional entropy, (3) that for the 4-symbol groups with both low and high fractional entropy, the predrug probability of occurrence correlated significantly and positively with the probability of occurrence of those groups after a drug treatment (ethanol), and (4) that the degree of drug-induced change in the incidence of both the low- and high-fractional-entropy groups did not correlate with predrug entropy. Thus, the entropy of clusters of 3-5 adjacent spike intervals, computed in this way, appears to be a useful index of the informational state of neurons.
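
The abstract describes the symbol encoding and entropy computation only in outline. The listing below is a minimal sketch in Python of one plausible reading of that procedure, assuming a three-symbol alphabet ('<' if the next interval is longer, '>' if shorter, '=' if equal within a tolerance), overlapping groups of adjacent symbols, and a fractional entropy of -p log2(p) per group, where p is the group's relative frequency. The function names (encode_intervals, fractional_entropy) and parameters (tol, group_len) are illustrative assumptions, not the authors' implementation.

import numpy as np
from collections import Counter

def encode_intervals(isis, tol=0.0):
    # Encode each pair of adjacent interspike intervals as a relational
    # symbol: '<' if the next interval is longer, '>' if shorter,
    # '=' if equal within tol. n intervals yield n-1 symbols.
    symbols = []
    for a, b in zip(isis[:-1], isis[1:]):
        if abs(b - a) <= tol:
            symbols.append('=')
        elif b > a:
            symbols.append('<')
        else:
            symbols.append('>')
    return symbols

def fractional_entropy(symbols, group_len=4):
    # Slide a window of group_len symbols over the sequence, estimate
    # each distinct group's probability of occurrence p, and return the
    # fractional entropy -p*log2(p) contributed by each group.
    groups = [''.join(symbols[i:i + group_len])
              for i in range(len(symbols) - group_len + 1)]
    counts = Counter(groups)
    n = len(groups)
    return {g: -(c / n) * np.log2(c / n) for g, c in counts.items()}

# Example on an artificial interspike-interval series (ms).
rng = np.random.default_rng(0)
isis = rng.exponential(scale=20.0, size=200)
h = fractional_entropy(encode_intervals(isis), group_len=4)
print('smallest fractional entropy:', min(h, key=h.get), round(min(h.values()), 4))
print('largest fractional entropy :', max(h, key=h.get), round(max(h.values()), 4))
print('total entropy (bits)       :', round(sum(h.values()), 4))

Summing the fractional entropies over all distinct groups gives the total entropy of the group distribution in bits. Since each symbol compares two adjacent intervals, a group of k symbols spans k+1 intervals; group_len=4 thus corresponds to the abstract's 4-symbol groups built from 5 adjacent spike intervals.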