Duality Between Learning Machines: A Bridge Between Supervised and Unsupervised Learning
- 1 May 1994
- journal article
- Published by MIT Press in Neural Computation
- Vol. 6 (3), 491-508
- https://doi.org/10.1162/neco.1994.6.3.491
Abstract
We exhibit a duality between two perceptrons that allows us to compare the theoretical analysis of supervised and unsupervised learning tasks. The first perceptron has one output and is asked to learn a classification of p patterns. The second (dual) perceptron has p outputs and is asked to transmit as much information as possible about a distribution of inputs. We show in particular that the maximum information that can be stored in the couplings for the supervised learning task is equal to the maximum information that can be transmitted by the dual perceptron.
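The central claim of the abstract can be restated in display form as a rough sketch; the notation below (ξ^μ for the p patterns, J for the couplings, x for the dual network's input, N for the input dimension) is assumed for illustration and is not taken from the paper itself.
```latex
% A sketch of the duality stated in the abstract.
% The symbols \xi^\mu (patterns), J (couplings), x (input), N, p are
% assumed notation for illustration, not taken from the paper.

% Supervised task: a single-output perceptron with couplings J must
% reproduce prescribed labels \sigma^\mu on the p patterns \xi^\mu.
\[
  \sigma^{\mu} = \operatorname{sign}\bigl(\mathbf{J}\cdot\boldsymbol{\xi}^{\mu}\bigr),
  \qquad \mu = 1,\dots,p .
\]

% Dual (unsupervised) task: a p-output perceptron whose \mu-th output unit
% has coupling vector \xi^\mu maps a random input x to p binary outputs.
\[
  \tau^{\mu} = \operatorname{sign}\bigl(\boldsymbol{\xi}^{\mu}\cdot\mathbf{x}\bigr),
  \qquad \mu = 1,\dots,p .
\]

% Claimed duality: the maximum information storable in the couplings of the
% supervised perceptron equals the maximum information the dual perceptron
% can transmit about its input distribution.
\[
  \max_{\mathbf{J}}\, I\bigl(\{\sigma^{\mu}\};\,\mathbf{J}\bigr)
  \;=\;
  \max\, I\bigl(\{\tau^{\mu}\};\,\mathbf{x}\bigr).
\]
```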