Performance surfaces of a single-layer perceptron
- 1 January 1990
- Journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 1 (3), 268-274
- https://doi.org/10.1109/72.80252
Abstract
A perceptron learning algorithm may be viewed as a steepest-descent method whereby an instantaneous performance function is iteratively minimized. An appropriate performance function for the most widely used perceptron algorithm is described, and it is shown that the update term of the algorithm is the gradient of this function. An example of the corresponding performance surface, based on Gaussian assumptions, is given, and it is shown that the surface has infinitely many stationary points. The performance surfaces of two related performance functions are also examined. Computer simulations demonstrating the convergence properties of the adaptive algorithms are given.
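To make the steepest-descent view concrete, the following is a minimal sketch, not taken from the paper. It assumes the standard perceptron rule with bipolar desired outputs d ∈ {−1, +1}, output y = sgn(wᵀx), and the instantaneous performance function J(w) = |wᵀx| − d·(wᵀx), whose gradient (y − d)x reproduces the familiar update w ← w + μ(d − y)x. The specific cost form, step size μ, and Gaussian toy data are assumptions chosen for illustration.

```python
# Minimal sketch (not the paper's code): the perceptron update as one
# steepest-descent step on the assumed instantaneous cost
#   J(w) = |w^T x| - d * (w^T x),  grad J = (sgn(w^T x) - d) * x = (y - d) * x
import numpy as np

def sgn(s):
    """Hard limiter producing bipolar outputs in {-1, +1}."""
    return 1.0 if s >= 0 else -1.0

def perceptron_step(w, x, d, mu=0.1):
    """One steepest-descent step: w <- w - mu * grad J = w + mu * (d - y) * x."""
    y = sgn(w @ x)
    return w + mu * (d - y) * x

# Toy data: two linearly separable Gaussian clusters (an assumption echoing
# the abstract's Gaussian setting, not the paper's actual example).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 0.5, (50, 2)),
               rng.normal(-1.0, 0.5, (50, 2))])
D = np.hstack([np.ones(50), -np.ones(50)])

w = np.zeros(2)
for _ in range(20):                      # a few passes over the data
    for x, d in zip(X, D):
        w = perceptron_step(w, x, d)

errors = sum(sgn(w @ x) != d for x, d in zip(X, D))
print(f"w = {w}, training errors = {errors}")
```

On this separable toy set the error count drops to zero within a few passes. Note that the paper's analysis concerns the expected performance surface under Gaussian input assumptions, not the sample path of a single run like this one.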
References
- Analysis of a perceptron learning algorithm with momentum updating. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Convergence properties and stationary points of a perceptron learning algorithm. Proceedings of the IEEE, 1990.
- Layered neural nets for pattern recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1988.
- An introduction to computing with neural nets. IEEE ASSP Magazine, 1987.
- Parallel Distributed Processing. Published by MIT Press, 1986.
- A useful theorem for nonlinear devices having Gaussian inputs. IEEE Transactions on Information Theory, 1958.