Recurrent nets that time and count
- 1 January 2000
- Conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 3, pp. 189-194
- https://doi.org/10.1109/ijcnn.2000.861302
Abstract
The size of the time intervals between events conveys information essential for numerous sequential tasks such as motor control and rhythm detection. While hidden Markov models tend to ignore this information, recurrent neural networks (RNNs) can in principle learn to make use of it. We focus on long short-term memory (LSTM) because it usually outperforms other RNNs. Surprisingly, LSTM augmented by "peephole connections" from its internal cells to its multiplicative gates can learn the fine distinction between sequences of spikes separated by either 50 or 49 discrete time steps, without the help of any short training exemplars. Without external resets or teacher forcing, and without loss of performance on previously reported tasks, our LSTM variant also learns to generate very stable sequences of highly nonlinear, precisely timed spikes. This makes LSTM a promising approach for real-world tasks that require timing and counting.
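To make the peephole mechanism concrete, below is a minimal NumPy sketch of one forward step of a peephole LSTM cell. It is an illustration under assumptions, not the paper's implementation: the weight names (Wx, Wh, p_i, p_f, p_o), the initialization, and the tanh nonlinearities are illustrative choices. The essential idea it shows is that the input and forget gates read the previous cell state and the output gate reads the freshly updated one, through diagonal (per-cell) peephole weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class PeepholeLSTMCell:
    """One LSTM cell block with peephole connections (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        def init(*shape):
            return rng.uniform(-0.1, 0.1, shape)
        # Stacked weights for the input-gate, forget-gate, cell, and output-gate paths.
        self.Wx = init(4 * hidden_size, input_size)
        self.Wh = init(4 * hidden_size, hidden_size)
        self.b = np.zeros(4 * hidden_size)
        # Diagonal peephole weights: each gate unit sees only its own cell's state.
        self.p_i = init(hidden_size)
        self.p_f = init(hidden_size)
        self.p_o = init(hidden_size)
        self.H = hidden_size

    def step(self, x, h_prev, c_prev):
        H = self.H
        z = self.Wx @ x + self.Wh @ h_prev + self.b
        zi, zf, zg, zo = z[:H], z[H:2*H], z[2*H:3*H], z[3*H:]
        i = sigmoid(zi + self.p_i * c_prev)  # input gate peeks at c_{t-1}
        f = sigmoid(zf + self.p_f * c_prev)  # forget gate peeks at c_{t-1}
        c = f * c_prev + i * np.tanh(zg)     # new cell state
        o = sigmoid(zo + self.p_o * c)       # output gate peeks at the new c_t
        h = o * np.tanh(c)
        return h, c

# Toy driver: two spikes 50 steps apart, the kind of interval the paper
# reports peephole LSTM can distinguish from a 49-step interval.
cell = PeepholeLSTMCell(input_size=1, hidden_size=4, rng=np.random.default_rng(0))
h, c = np.zeros(4), np.zeros(4)
spikes = np.zeros((100, 1))
spikes[0], spikes[50] = 1.0, 1.0
for x in spikes:
    h, c = cell.step(x, h, c)
```

Because the peephole terms feed the cell state directly into the gate pre-activations, a gate can learn to open or close once its cell's internal value crosses a learned threshold, which is what timing or counting a fixed interval requires.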