FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling
- 1 September 1991
- journal article
- Published by MIT Press in Neural Computation
- Vol. 3 (3), 375-385
- https://doi.org/10.1162/neco.1991.3.3.375
Abstract
A new neural network architecture involving either local-feedforward global-feedforward and/or local-recurrent global-feedforward structures is proposed. A learning rule minimizing a mean square error criterion is derived. The performance of this algorithm (the local-recurrent global-feedforward architecture) is compared with that of a local-feedforward global-feedforward architecture. It is shown that the local-recurrent global-feedforward model performs better than the local-feedforward global-feedforward model.
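The architectures named in the abstract can be pictured concretely: each synapse is a linear filter rather than a single weight. In the local-feedforward case the synapse is an FIR filter over past inputs; in the local-recurrent case it is an IIR filter whose own past outputs are fed back. The network as a whole remains feedforward (global feedforward), with each neuron summing its synapse outputs through a squashing nonlinearity. The sketch below is an illustrative reconstruction of that idea, not the paper's exact formulation; all function names and the choice of `tanh` are assumptions.

```python
import math

def fir_synapse(x, taps):
    """FIR synapse (local feedforward): the synapse output at time t is a
    weighted sum of the current and past inputs on that connection."""
    return [sum(b * x[t - k] for k, b in enumerate(taps) if t - k >= 0)
            for t in range(len(x))]

def iir_synapse(x, taps, fb):
    """IIR synapse (local recurrent): as above, but past synapse outputs
    are fed back through the recurrent coefficients `fb`."""
    y = []
    for t in range(len(x)):
        s = sum(b * x[t - k] for k, b in enumerate(taps) if t - k >= 0)
        # feedback taps act on this synapse's own earlier outputs
        s += sum(a * y[t - m] for m, a in enumerate(fb, start=1) if t - m >= 0)
        y.append(s)
    return y

def neuron(x_lines, synapses, bias=0.0):
    """Global-feedforward combination: sum the synapse outputs over all
    input lines and pass the result through a squashing nonlinearity."""
    T = len(x_lines[0])
    sums = [bias] * T
    for x, syn in zip(x_lines, synapses):
        for t, v in enumerate(syn(x)):
            sums[t] += v
    return [math.tanh(s) for s in sums]
```

For a constant input sequence, an FIR synapse settles once its taps are filled, whereas an IIR synapse with feedback coefficient 0.5 keeps accumulating toward a steady state; the paper's learning rule would adjust both the feedforward and feedback coefficients by gradient descent on the mean square error.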