Learning in the multiple class random neural network
- 10 December 2002
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 13 (6) , 1257-1267
- https://doi.org/10.1109/tnn.2002.804228
Abstract
Spiked recurrent neural networks with "multiple classes" of signals were recently introduced by Gelenbe and Fourneau (1999) as an extension of the recurrent spiked random neural network introduced by Gelenbe (1989). These new networks can represent interconnected neurons that simultaneously process multiple streams of data, such as the color information of images, or networks that simultaneously process streams of data from multiple sensors. This paper introduces a learning algorithm that applies to both recurrent and feedforward multiple signal class random neural networks (MCRNNs). It is based on gradient descent optimization of a cost function. The algorithm exploits the analytical properties of the MCRNN and requires the solution of a system of nC linear and nC nonlinear equations (where C is the number of signal classes and n is the number of neurons) each time the network learns a new input-output pair. Thus, the algorithm is of O([nC]³) complexity for the recurrent case and O([nC]²) for a feedforward MCRNN. Finally, we apply this learning algorithm to color texture modeling (learning), based on learning the weights of a recurrent network directly from the color texture image. The same trained recurrent network is then used to generate a synthetic texture that imitates the original. This approach is illustrated with various synthetic and natural textures.
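The structure described in the abstract (solve the network's nonlinear steady-state equations, then update the weights by gradient descent on a cost function) can be illustrated with a small sketch. The Python snippet below is not the paper's MCRNN algorithm: it keeps a single signal class (the Gelenbe (1989) model) and uses finite-difference gradients in place of the analytical gradients the paper obtains by solving nC linear equations per training pair. All function and variable names (steady_state, train_step, W_plus, W_minus) are illustrative assumptions, not identifiers from the paper.

```python
# Hedged sketch: a minimal single-class random neural network (RNN) in the
# sense of Gelenbe (1989), trained by gradient descent on a quadratic cost.
# The paper's MCRNN generalizes the state q_i to per-class probabilities
# q_{i,c}; this sketch also replaces the paper's analytical gradient (an
# (nC)-dimensional linear solve per training pair) with finite differences.
import numpy as np

def steady_state(W_plus, W_minus, Lambda, lam, iters=200):
    """Fixed-point iteration for the steady-state excitation probabilities q:
    q_i = (Lambda_i + sum_j q_j W_plus[j, i]) /
          (r_i + lam_i + sum_j q_j W_minus[j, i]),
    with firing rate r_i = sum_j (W_plus[i, j] + W_minus[i, j])."""
    r = W_plus.sum(axis=1) + W_minus.sum(axis=1)
    q = np.zeros_like(Lambda)
    for _ in range(iters):
        num = Lambda + q @ W_plus
        den = r + lam + q @ W_minus
        q = np.clip(num / np.maximum(den, 1e-12), 0.0, 1.0)
    return q

def cost(W_plus, W_minus, Lambda, lam, target):
    q = steady_state(W_plus, W_minus, Lambda, lam)
    return 0.5 * np.sum((q - target) ** 2)

def train_step(W_plus, W_minus, Lambda, lam, target, lr=0.05, eps=1e-5):
    """One gradient-descent step on the excitatory/inhibitory rates, using
    central finite differences purely for illustration."""
    for W in (W_plus, W_minus):          # update each rate matrix in place
        grad = np.zeros_like(W)
        for i in range(W.shape[0]):
            for j in range(W.shape[1]):
                W[i, j] += eps
                c_hi = cost(W_plus, W_minus, Lambda, lam, target)
                W[i, j] -= 2 * eps
                c_lo = cost(W_plus, W_minus, Lambda, lam, target)
                W[i, j] += eps
                grad[i, j] = (c_hi - c_lo) / (2 * eps)
        W -= lr * grad
        np.clip(W, 0.0, None, out=W)     # rates must remain nonnegative
    return cost(W_plus, W_minus, Lambda, lam, target)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 4
    W_plus = rng.uniform(0.1, 0.5, (n, n))
    W_minus = rng.uniform(0.1, 0.5, (n, n))
    Lambda = rng.uniform(0.5, 1.0, n)    # exogenous excitatory arrival rates
    lam = rng.uniform(0.1, 0.3, n)       # exogenous inhibitory arrival rates
    target = np.array([0.8, 0.2, 0.6, 0.4])
    for _ in range(50):
        c = train_step(W_plus, W_minus, Lambda, lam, target)
    print("final cost:", c)
```

In the multiple-class setting of the paper, the scalar state q_i becomes a per-class probability q_{i,c}, the fixed point becomes a system of nC nonlinear equations, and the analytical gradient computation costs O([nC]³) per learned input-output pair in the recurrent case, as stated in the abstract.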
This publication has 27 references indexed in Scilit; selected entries:
- Low bit-rate video compression with neural networks and temporal subsampling. Proceedings of the IEEE, 1996
- Stability of the Random Neural Network Model. Neural Computation, 1990
- Random Neural Networks with Negative and Positive Signals and Product Form Solution. Neural Computation, 1989
- Learning State Space Trajectories in Recurrent Neural Networks. Neural Computation, 1989
- Recurrent Backpropagation and the Dynamical Approach to Adaptive Neural Computation. Neural Computation, 1989
- Simple Parallel Hierarchical and Relaxation Algorithms for Segmenting Noncausal Markovian Random Fields. Published by Institute of Electrical and Electronics Engineers (IEEE), 1987
- Modeling and Segmentation of Noisy and Textured Images Using Gibbs Random Fields. Published by Institute of Electrical and Electronics Engineers (IEEE), 1987
- Markov Random Field Texture Models. Published by Institute of Electrical and Electronics Engineers (IEEE), 1983
- Experiments in the Visual Perception of Texture. Scientific American, 1975
- Inability of Humans to Discriminate between Visual Textures That Agree in Second-Order Statistics—Revisited. Perception, 1973