Replicator Neural Networks for Universal Optimal Source Coding
- 29 September 1995
- journal article
- Published by American Association for the Advancement of Science (AAAS) in Science
- Vol. 269 (5232), 1860-1863
- https://doi.org/10.1126/science.269.5232.1860
Abstract
Replicator neural networks self-organize by using their inputs as desired outputs; they internally form a compressed representation for the input data. A theorem shows that a class of replicator networks can, through the minimization of mean squared reconstruction error (for instance, by training on raw data examples), carry out optimal data compression for arbitrary data vector sources. Data manifolds, a new general model of data sources, are then introduced, and a second theorem shows that, in a practically important limiting case, optimal-compression replicator networks operate by creating an essentially unique natural coordinate system for the manifold.
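To make the training principle in the abstract concrete, the following is a minimal sketch in Python/NumPy of a replicator network in the generic sense: a network trained so that its inputs serve as its own targets, with mean squared reconstruction error minimized by gradient descent. The single sigmoid bottleneck, layer sizes, and learning rate are illustrative assumptions, not the paper's architecture (the paper's networks are deeper, and its theorems characterize the optimum rather than a particular training scheme).

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class Replicator:
    """Toy replicator network: inputs are used as their own targets.

    Hypothetical one-hidden-layer sketch with a sigmoid bottleneck and a
    linear decoder; not the architecture described in the paper.
    """

    def __init__(self, n_in, n_code, seed=0):
        rng = np.random.default_rng(seed)
        # Encoder (W1, b1) maps input to the compressed code;
        # decoder (W2, b2) maps the code back to input space.
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_code))
        self.b1 = np.zeros(n_code)
        self.W2 = rng.normal(0.0, 0.1, (n_code, n_in))
        self.b2 = np.zeros(n_in)

    def forward(self, X):
        H = sigmoid(X @ self.W1 + self.b1)  # compressed representation
        Y = H @ self.W2 + self.b2           # reconstruction of the input
        return H, Y

    def train_step(self, X, lr=0.1):
        """One gradient-descent step on mean squared reconstruction error."""
        H, Y = self.forward(X)
        n = X.shape[0]
        err = Y - X
        loss = float(np.mean(np.sum(err ** 2, axis=1)))
        # Backpropagate the MSE loss through decoder and encoder.
        dY = 2.0 * err / n
        dW2 = H.T @ dY
        db2 = dY.sum(axis=0)
        dH = (dY @ self.W2.T) * H * (1.0 - H)  # sigmoid derivative
        dW1 = X.T @ dH
        db1 = dH.sum(axis=0)
        self.W1 -= lr * dW1
        self.b1 -= lr * db1
        self.W2 -= lr * dW2
        self.b2 -= lr * db2
        return loss


# Toy usage: points scattered about a one-dimensional segment in R^3,
# compressed through a single code unit.
rng = np.random.default_rng(1)
t = rng.uniform(-1.0, 1.0, (512, 1))
direction = np.array([[1.0, 0.5, -0.3]])
X = t @ direction + 0.01 * rng.normal(size=(512, 3))

net = Replicator(n_in=3, n_code=1)
for _ in range(3000):
    loss = net.train_step(X, lr=0.1)
print(f"final mean squared reconstruction error: {loss:.5f}")
```

In this toy setting the single code unit learns a coordinate along the segment that the data lie near, which loosely illustrates the "natural coordinate system" of the abstract's second theorem, here for the simplest possible one-dimensional manifold.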