Performance analysis of locally recurrent neural networks
- 1 December 1998
- journal article
- Published by Emerald Publishing in COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering
- Vol. 17 (6), 708-716
- https://doi.org/10.1108/03321649810221251
Abstract
Many practical applications of neural networks require the identification of non‐linear deterministic systems or chaotic systems. In these cases the use of a network architecture known as the locally recurrent neural network (LRNN) is often preferable to standard feedforward multi‐layer perceptron (MLP) networks or to globally recurrent neural networks. In this paper locally recurrent networks are used to simulate the behaviour of Chua's circuit, which can be considered a paradigm for studying chaos. It is shown that such networks are able to identify the underlying link among the state variables of Chua's circuit. Moreover, they are able to behave like an autonomous Chua's double scroll, reproducing the chaotic behaviour of the state variables obtainable through a suitable choice of circuit elements.
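For readers unfamiliar with the target system, the following is a minimal sketch of the autonomous Chua circuit in its dimensionless state-space form, i.e. the dynamics the LRNN is trained to imitate. The parameter values (alpha, beta, m0, m1) are standard textbook double-scroll values chosen for illustration; they are assumptions, not taken from this paper.

```python
# Dimensionless Chua circuit:
#   dx/dt = alpha * (y - x - f(x))
#   dy/dt = x - y + z
#   dz/dt = -beta * y
# where f(x) is the piecewise-linear characteristic of the Chua diode.
# Parameters below are classic double-scroll values (illustrative only).

def chua(state, alpha=9.0, beta=100.0 / 7.0, m0=-8.0 / 7.0, m1=-5.0 / 7.0):
    x, y, z = state
    # Piecewise-linear nonlinearity of the Chua diode
    f = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1.0) - abs(x - 1.0))
    return (alpha * (y - x - f), x - y + z, -beta * y)

def rk4_step(state, dt=0.01):
    # One fourth-order Runge-Kutta integration step
    k1 = chua(state)
    k2 = chua(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = chua(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = chua(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def simulate(n_steps=5000, state=(0.7, 0.0, 0.0)):
    # Integrate the autonomous circuit; the trajectory stays on the
    # bounded double-scroll attractor rather than diverging.
    traj = [state]
    for _ in range(n_steps):
        state = rk4_step(state)
        traj.append(state)
    return traj

traj = simulate()
xs = [s[0] for s in traj]
```

In the identification setting described by the abstract, trajectories like `traj` would serve as training data: the network receives past state samples and is trained to reproduce the map linking the three state variables.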