A "Thermal" Perceptron Learning Rule
- 1 November 1992
- journal article
- Published by MIT Press in Neural Computation
- Vol. 4 (6), 946-957
- https://doi.org/10.1162/neco.1992.4.6.946
Abstract
The thermal perceptron is a simple extension of Rosenblatt's perceptron learning rule for training individual linear threshold units. It finds stable weights for nonseparable problems as well as separable ones. Experiments indicate that, given a good initial setting for the temperature parameter T0, the thermal perceptron outperforms the Pocket algorithm and methods based on gradient descent. The learning rule stabilizes the weights (learns) over a fixed training period. For separable problems it finds separating weights much more quickly than the usual rules.
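The rule described in the abstract can be sketched in code. This is a hedged illustration, not the paper's exact pseudocode: it assumes the commonly cited form of the thermal perceptron, in which the standard perceptron update on a misclassified example is scaled by exp(-|phi|/T), where phi is the unit's net input, and the temperature T is annealed from T0 down toward zero over the fixed training period (the linear schedule here is an assumption).

```python
import numpy as np

def thermal_perceptron(X, y, T0=1.0, epochs=100, rng=None):
    """Sketch of a thermal perceptron for a single linear threshold unit.

    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    On each misclassified example, the perceptron update is scaled by
    exp(-|phi| / T), where phi = w.x + b is the net input. T is annealed
    from T0 toward 0 (linear schedule assumed), so large corrections to
    confidently wrong examples are suppressed and the weights stabilize
    by the end of training.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for epoch in range(epochs):
        T = T0 * (1.0 - epoch / epochs)  # anneal temperature toward 0
        for i in rng.permutation(n):
            phi = w @ X[i] + b
            if y[i] * phi <= 0:  # misclassified (or on the boundary)
                scale = np.exp(-abs(phi) / max(T, 1e-12))
                w += scale * y[i] * X[i]
                b += scale * y[i]
    return w, b
```

With T large the rule behaves like the ordinary perceptron; as T shrinks, updates for examples with large |phi| are damped toward zero, which is what lets the weights settle to a stable compromise on nonseparable data instead of cycling.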