AN INCREMENTAL FINE ADJUSTMENT ALGORITHM FOR THE DESIGN OF OPTIMAL INTERPOLATING NEURAL NETWORKS
- 1 October 1991
- journal article
- research article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Pattern Recognition and Artificial Intelligence
- Vol. 05 (04), 563-579
- https://doi.org/10.1142/s0218001491000326
Abstract
A two-stage design method for artificial neural networks is presented. The first stage is an evolutionary RLS (recursive least squares) algorithm that determines the optimal configuration of the net based on the concept of optimal interpolation. During this stage, the members of a given sample set are processed sequentially, and a small consistent subset, constituting what we call prototypes, is selected to form the building blocks of the net. The synaptic weights as well as the internal dimensions of the net are updated recursively as each new prototype is selected. The evolving net at each intermediate step is a modified version of the Optimal Interpolative (OI) net derived in a recent paper by one of the authors. The concept of an evolving network configuration is attractive since it does not require the prescription of a fixed configuration in order to learn the optimal synaptic weights. This can eventually lead to a network architecture that is only as complex as it needs to be to achieve the required interpolation function. The second stage is the fine adjustment of the synaptic weights of the network structure acquired during the first stage. This stage is a two-step iterative optimization procedure using the method of steepest descent. The initial values of the synaptic weights in the iterative search are obtained from the first stage; they are indeed very close to the optimal values. Hence, fast convergence during the second stage is guaranteed.
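The two-stage procedure described in the abstract can be illustrated with a minimal sketch. The assumptions here are not from the paper: a Gaussian kernel stands in for the OI-net's hidden-layer response, a batch least-squares solve replaces the paper's recursive (RLS) weight update, the output is scalar, and the threshold `tol` and all function names are illustrative.

```python
import numpy as np

# Minimal sketch of the two-stage design, under stated assumptions: a
# Gaussian interpolation kernel stands in for the OI-net hidden layer,
# and a batch least-squares solve replaces the paper's recursive (RLS)
# update; names and thresholds are illustrative, not from the paper.

def kernel(X, P, sigma=1.0):
    # Gaussian responses of samples X to the current prototypes P.
    d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def stage1_select_prototypes(X, y, tol=0.05, sigma=1.0):
    # Scan the samples sequentially; whenever the evolving net fails to
    # interpolate a sample within `tol`, promote it to a prototype and
    # re-solve for the synaptic weights (the net grows only as needed).
    protos, w = [], None
    for i in range(len(X)):
        if protos:
            err = abs((kernel(X[i:i + 1], np.array(protos), sigma) @ w)[0] - y[i])
        else:
            err = abs(y[i])
        if err > tol:
            protos.append(X[i])
            P = np.array(protos)
            G = kernel(X[:i + 1], P, sigma)
            w, *_ = np.linalg.lstsq(G, y[:i + 1], rcond=None)
    return np.array(protos), w

def stage2_fine_tune(X, y, P, w, sigma=1.0, lr=0.01, steps=200):
    # Steepest-descent refinement of the weights over the full sample set,
    # starting from the near-optimal values delivered by stage 1.
    G = kernel(X, P, sigma)
    for _ in range(steps):
        w = w - lr * (G.T @ (G @ w - y)) / len(X)  # gradient of 0.5*MSE
    return w

# Toy usage on a synthetic regression task.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (50, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2
P, w = stage1_select_prototypes(X, y)
mse1 = ((kernel(X, P) @ w - y) ** 2).mean()
w = stage2_fine_tune(X, y, P, w)
mse2 = ((kernel(X, P) @ w - y) ** 2).mean()
```

In this sketch the threshold `tol` plays the role of the consistency criterion: a smaller value yields more prototypes and a larger net, mirroring the trade-off between architecture size and interpolation accuracy that the evolving configuration is meant to manage.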