Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis
- 1 April 2005
- Research article
- Published by MIT Press in Neural Computation
- Vol. 17 (4), 741-778
- https://doi.org/10.1162/0899766053429435
Abstract
Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
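As a rough illustration of the two quantities the abstract compares, the following minimal Python sketch (not taken from the paper) computes mutual information and the maximum a posteriori (MAP) ideal observer's proportion correct from a hypothetical discrete stimulus-response joint distribution. The probability table, variable names, and MAP decoding rule are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical joint distribution P(stimulus, response): rows are stimuli,
# columns are responses, and all entries sum to 1 (values chosen for illustration).
P = np.array([
    [0.20, 0.05, 0.05],
    [0.05, 0.20, 0.05],
    [0.05, 0.05, 0.30],
])

def mutual_information(joint):
    """Mutual information I(S;R) in bits for a discrete joint distribution."""
    ps = joint.sum(axis=1, keepdims=True)   # stimulus marginal P(s)
    pr = joint.sum(axis=0, keepdims=True)   # response marginal P(r)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (ps * pr))
    return float(np.nansum(terms))           # zero-probability cells contribute nothing

def map_proportion_correct(joint):
    """Ideal observer accuracy: for each response, guess the a posteriori most probable stimulus."""
    return float(joint.max(axis=0).sum())

print(f"I(S;R)         = {mutual_information(P):.3f} bits")
print(f"MAP P(correct) = {map_proportion_correct(P):.3f}")
```

Both numbers are computed from the same joint table, but, as the abstract argues, one does not in general determine the other; classical results such as Fano's inequality relate conditional entropy to error probability only through bounds.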