Auto-association by multilayer perceptrons and singular value decomposition
- 1 September 1988
- journal article
- research article
- Published by Springer Nature in Biological Cybernetics
- Vol. 59 (4-5), 291-294
- https://doi.org/10.1007/bf00332918
Abstract
The multilayer perceptron, when working in auto-association mode, is sometimes considered an interesting candidate for performing data compression or dimensionality reduction of the feature space in information processing applications. The present paper shows that, for auto-association, the nonlinearities of the hidden units are useless and that the optimal parameter values can be derived directly by purely linear techniques relying on singular value decomposition and low rank matrix approximation, similar in spirit to the well-known Karhunen-Loève transform. This approach thus appears to be an efficient alternative to the general error back-propagation algorithm commonly used for training multilayer perceptrons. Moreover, it also gives a clear interpretation of the rôle of the different parameters.
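
The abstract's central claim, that the optimal parameters of a linear auto-associator follow directly from a singular value decomposition rather than from back-propagation, can be illustrated with a short numerical sketch. The code below is not the authors' implementation; it is a minimal illustration assuming NumPy, an arbitrary choice of p hidden units, and a synthetic data matrix. It builds the encoder and decoder weights from the leading left singular vectors of the centred data, which gives the best rank-p reconstruction in the least-squares sense (Eckart-Young), in the spirit of the Karhunen-Loève transform.

```python
import numpy as np

# Minimal sketch (not the authors' code): for a linear auto-associator
# x -> W2 @ (W1 @ x) with p hidden units, the optimal reconstruction of a
# data matrix X is its best rank-p approximation, obtained directly from
# the SVD rather than by error back-propagation.

rng = np.random.default_rng(0)
n_features, n_samples, p = 8, 200, 3   # p = number of hidden units (assumed)

# Synthetic data with an approximately p-dimensional structure (assumed setup).
X = rng.standard_normal((n_features, p)) @ rng.standard_normal((p, n_samples))
X += 0.05 * rng.standard_normal((n_features, n_samples))

# Centre the data, as in the Karhunen-Loève transform.
Xc = X - X.mean(axis=1, keepdims=True)

# SVD of the centred data matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Up = U[:, :p]                  # leading p left singular vectors

# One optimal parameter choice for the two-layer linear network:
W1 = Up.T                      # encoder: project onto the p-dimensional principal subspace
W2 = Up                        # decoder: map back to feature space

X_hat = W2 @ (W1 @ Xc)         # network output = best rank-p approximation of Xc
err = np.linalg.norm(Xc - X_hat) ** 2 / np.linalg.norm(Xc) ** 2
print(f"relative reconstruction error with p={p} hidden units: {err:.4f}")
```

Note that only the product W2 @ W1, a rank-p projector onto the principal subspace, is determined by this optimality criterion; any invertible recombination of the hidden units leaves the reconstruction unchanged, so the particular W1 and W2 above are just one convenient choice.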