Eigenvalues of covariance matrices: Application to neural-network learning
- 6 May 1991
- Research article
- Published by the American Physical Society (APS) in Physical Review Letters
- Vol. 66 (18), 2396-2399
- https://doi.org/10.1103/physrevlett.66.2396
Abstract
The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
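The following is a minimal numerical sketch of the effect the abstract describes, not the paper's analytic computation. It assumes a single linear unit trained with a quadratic cost, so that the Hessian equals the input covariance matrix built by summing outer products of random pattern vectors; the dimensions, bias level, and random seed are illustrative choices.

```python
# Sketch (assumed setup, not the paper's derivation): form a covariance-type
# matrix by summing outer products of random vectors, then compare the
# eigenvalue spectrum for centered vs. biased state variables.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 400  # input dimension, number of patterns (illustrative values)

def hessian_spectrum(bias):
    # Patterns xi with mean `bias`. For a linear unit with quadratic (LMS)
    # cost, the Hessian is the input covariance matrix H = (1/P) sum xi xi^T.
    xi = rng.normal(loc=bias, scale=1.0, size=(P, N))
    H = xi.T @ xi / P
    return np.linalg.eigvalsh(H)  # eigenvalues in ascending order

for bias in (0.0, 1.0):
    ev = hessian_spectrum(bias)
    # Gradient-descent learning time scales with the spread of eigenvalues
    # (condition number lambda_max / lambda_min), so an outlier eigenvalue
    # slows convergence along the remaining directions.
    print(f"bias={bias}: lambda_min={ev[0]:.3f}, lambda_max={ev[-1]:.3f}, "
          f"condition={ev[-1] / ev[0]:.1f}")
```

With centered inputs (bias = 0) the spectrum stays within a narrow band, whereas a nonzero mean produces a single large outlier eigenvalue of order N times the squared bias, inflating the condition number; this is the spectral picture behind the preference for centered over biased state variables.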