Identification of nonlinear systems using new dynamic neural network structures
- 24 August 2005
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 20-25, vol. 1
- https://doi.org/10.1109/cdc.1992.371800
Abstract
The authors study the stability and convergence properties of recurrent high-order neural networks (RHONNs) as models of nonlinear dynamical systems. The overall structure of the RHONN consists of dynamical elements distributed throughout the network in the form of dynamical neurons, which are interconnected by high-order connections. It is shown that if a sufficiently large number of high-order connections between neurons is allowed, the RHONN model is capable of approximating the input-output behavior of general dynamical systems to any degree of accuracy. Based on the linear-in-the-weights property of the RHONN model, the authors develop identification schemes and derive adaptive laws for adjusting the weights, and analyze the convergence and stability properties of these laws. In the case of no modeling error, the state error between the system and the RHONN model converges to zero asymptotically. If modeling errors are present, the σ-modification is proposed as a method of guaranteeing the stability of the overall scheme. The feasibility of applying these techniques is demonstrated by considering the identification of a simple rigid robotic system.
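The ingredients described in the abstract — a model that is linear in the weights, a gradient adaptive law driven by the state error, and a σ-modification term for robustness — can be sketched in code. The following is a minimal illustration, not the paper's implementation: the toy plant, the particular high-order regressor terms, and the gains (`a`, `gamma`, `sigma_mod`) are all assumptions chosen for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def plant(x, u):
    # Unknown nonlinear system to be identified (toy example, not from the paper).
    return -x + u + 0.1 * np.sin(x)

def regressor(x, u):
    # High-order terms: products of sigmoidal functions of state and input,
    # plus a bias term. The choice of terms here is illustrative.
    s_x, s_u = sigmoid(x), sigmoid(u)
    return np.array([1.0, s_x, s_u, s_x * s_u, s_x**2])

def identify(T=20.0, dt=1e-3, a=1.0, gamma=50.0, sigma_mod=0.01):
    """Euler-integrate plant, RHONN model, and weight adaptation together."""
    steps = int(T / dt)
    x, x_hat = 0.0, 0.0
    w = np.zeros(5)
    for k in range(steps):
        u = np.sin(0.5 * k * dt)        # persistently exciting input
        z = regressor(x, u)
        e = x_hat - x                   # identification (state) error
        # RHONN model is linear in the weights: x_hat_dot = -a*x_hat + w.z
        x_hat += dt * (-a * x_hat + w @ z)
        x += dt * plant(x, u)
        # Gradient adaptive law with sigma-modification: the -sigma_mod*w
        # leakage term keeps the weights bounded under modeling error.
        w += dt * (-gamma * e * z - sigma_mod * w)
    return abs(x_hat - x), w

err, w = identify()
print(err)
```

Because the regressor cannot represent the plant exactly, a residual modeling error remains; the σ-modification trades exact convergence of the error for guaranteed boundedness of the weights, which matches the robustness argument summarized in the abstract.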