Abstract
This paper compares several statistical methods for analyzing neural feature selectivity with natural stimuli. Despite the non‐Gaussian character of correlations in natural stimuli, several relevant stimulus dimensions can be found by maximizing either information or, as demonstrated here, variance. In the information‐based scheme, the relevance of each dimension is quantified by the Kullback–Leibler divergence between the full distribution of inputs and the distribution of inputs associated with positive neural responses, both projected onto that dimension. We demonstrate that least‐squares fitting of a nonlinear prediction based on several relevant dimensions to the recorded spike trains yields an optimization scheme similar to information maximization: the relevant dimensions are found as those that capture the most variance in the neural response, and the variance along a stimulus dimension is given by a Rényi divergence of order 2 between the same two probability distributions, rather than the Kullback–Leibler divergence used for maximizing information. Analytical and numerical calculations show that the statistical errors expected for the two schemes are comparable, although in the asymptotic limit of large spike numbers maximizing information yields smaller errors than maximizing variance. Numerical simulations of model cells with different noise levels show that this trend persists, and possibly strengthens, as the number of spikes decreases. The problem of finding relevant dimensions is thus one example where information‐theoretic approaches are no more data‐limited than variance‐based measures. Both variance and information optimization also outperform methods based on the spike‐triggered average across all spike counts and neural noise levels.
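As a concrete illustration of the two objectives compared here, the following minimal Python sketch (not from the paper; the function name, binning scheme, and base‐2 logarithms are illustrative assumptions) estimates both quantities for a single candidate dimension from binned projections of the full stimulus ensemble and the spike‐triggered ensemble.

import numpy as np

def projected_objectives(stimuli, spike_counts, v, n_bins=25):
    # stimuli: (N, D) array of stimulus vectors; spike_counts: (N,) nonnegative
    # spike counts; v: (D,) candidate relevant dimension. All names hypothetical.
    x = stimuli @ v                                # project each stimulus onto v
    edges = np.histogram_bin_edges(x, bins=n_bins)
    p_all, _ = np.histogram(x, bins=edges)                        # P(x): all stimuli
    p_spk, _ = np.histogram(x, bins=edges, weights=spike_counts)  # P(x|spike)
    p_all = p_all / p_all.sum()
    p_spk = p_spk / p_spk.sum()
    m = (p_all > 0) & (p_spk > 0)                  # skip empty bins
    # Information: Kullback-Leibler divergence D(P(x|spike) || P(x))
    info = np.sum(p_spk[m] * np.log2(p_spk[m] / p_all[m]))
    # Variance-based measure: Renyi divergence of order 2 between the
    # same two projected distributions
    renyi2 = np.log2(np.sum(p_spk[m] ** 2 / p_all[m]))
    return info, renyi2

Maximizing either quantity over candidate directions v (for instance, by gradient ascent with held‐out data to control overfitting) would then recover the relevant subspace; the two objectives differ only in which divergence they apply to the same pair of projected distributions.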