Quantization Complexity and Independent Measurements
- 1 January 1974
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Computers
- Vol. C-23 (1), 102-106
- https://doi.org/10.1109/t-c.1974.223789
Abstract
It is known that, in general, the number of measurements in a pattern classification problem cannot be increased arbitrarily when the class-conditional densities are not completely known and only a finite number of learning samples are available. Above a certain number of measurements, the performance starts deteriorating instead of improving steadily. It was earlier shown by one of the authors that an exception to this "curse of finite sample size" is constituted by the case of binary independent measurements if a Bayesian approach is taken and uniform a priori densities on the unknown parameters are assumed. In this paper, the following generalizations are considered: arbitrary quantization and the use of maximum likelihood estimates. Further, the existence of an optimal quantization complexity is demonstrated, and its relationship to both the dimensionality of the measurement vector and the sample size is discussed. It is shown that the optimum number of quantization levels decreases with increasing dimensionality for a fixed sample size, and increases with the sample size for fixed dimensionality.
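The peaking behavior described in the abstract can be illustrated with a small simulation. The sketch below is not from the paper; it is a minimal illustration, assuming two classes with independent discrete measurements, maximum-likelihood (relative-frequency) estimates of the class-conditional cell probabilities from a finite training set, and a plug-in independent-measurement (naive Bayes) classifier. All parameter choices (dimensionality, training-set size, the Dirichlet draw of the true cell probabilities, the small floor that avoids log(0)) are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch (assumed setup, not the paper's): sweep the number of
# quantization levels L for fixed dimensionality and sample size and watch
# test accuracy of a plug-in independent-measurement classifier.
import numpy as np

rng = np.random.default_rng(0)

def true_cell_probs(d, L):
    """Draw per-measurement cell probabilities for two classes (Dirichlet(1); an assumption)."""
    return rng.dirichlet(np.ones(L), size=d), rng.dirichlet(np.ones(L), size=d)

def sample(p, m):
    """Sample m pattern vectors from d independent measurements; p has shape (d, L)."""
    d, L = p.shape
    return np.array([rng.choice(L, size=m, p=p[i]) for i in range(d)]).T

def ml_estimates(x, L, eps=1e-9):
    """Relative-frequency (ML) estimates of cell probabilities, with a tiny floor to avoid log(0)."""
    counts = np.stack([np.bincount(x[:, i], minlength=L) for i in range(x.shape[1])])
    return (counts + eps) / (counts.sum(axis=1, keepdims=True) + L * eps)

def test_accuracy(L, d=8, m_train=50, m_test=5000):
    """Train the plug-in classifier on m_train samples per class and measure test accuracy."""
    p0, p1 = true_cell_probs(d, L)
    q0 = ml_estimates(sample(p0, m_train), L)
    q1 = ml_estimates(sample(p1, m_train), L)
    x = np.vstack([sample(p0, m_test), sample(p1, m_test)])
    y = np.r_[np.zeros(m_test), np.ones(m_test)]
    idx = np.arange(d)
    log_lik0 = np.log(q0[idx, x]).sum(axis=1)  # product over independent measurements, in log form
    log_lik1 = np.log(q1[idx, x]).sum(axis=1)
    return np.mean((log_lik1 > log_lik0) == y)

for L in (2, 3, 4, 6, 8, 12, 16, 24):
    acc = np.mean([test_accuracy(L) for _ in range(20)])
    print(f"L = {L:2d} levels: mean test accuracy = {acc:.3f}")
```

With these illustrative settings the printed accuracies generally rise to a peak at a moderate number of levels and fall again for the finest quantizations, mirroring the abstract's point that the optimum number of quantization levels depends on both the dimensionality and the sample size.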
References
- On dimensionality and sample size in statistical pattern classification. Pattern Recognition, 1971.
- Independence of measurements and the mean recognition accuracy. IEEE Transactions on Information Theory, 1971.
- Comments on "On the mean accuracy of statistical pattern recognizers" by Hughes, G. F. IEEE Transactions on Information Theory, 1969.
- On the mean accuracy of statistical pattern recognizers. IEEE Transactions on Information Theory, 1968.