Abstract
Summary form only given. The authors applied a simple backpropagation neural network on a very large scale in an attempt to associate many primary sequences with representations of the corresponding three-dimensional structures. The training set consisted of 25 sequences (the input layer, 130 amino acids long) associated with 25 130×130 distance matrices (the output layer, 16,900 neurons). Each amino acid was coded according to its hydrophobicity (range ±1; the degree to which it avoids contact with water), and the Euclidean distances in the distance matrices were normalized to the largest distance in the training set (range 0-1; about 40 Å). The network was configured with a single fully connected hidden layer of 50 to 1000 neurons using the network description language (NDL, also called BigNet). The network simulation was run on a Cray-2 supercomputer with four processors and 512 million words of random-access memory. The network achieved rates of two million connections per second in full backpropagation learning mode and was able to learn some aspects of the sequence-to-structure mapping.
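As a rough illustration of the architecture described above (not the authors' NDL/BigNet implementation, which is not reproduced here), the following Python/NumPy sketch sets up a fully connected network with a 130-unit hydrophobicity input, a single hidden layer, and a 16,900-unit output standing for a flattened 130×130 distance matrix. The layer sizes and value ranges come from the abstract; the sigmoid activation, squared-error loss, learning rate, and weight initialization are assumptions for illustration only.

    import numpy as np

    # Layer sizes taken from the abstract; everything else (sigmoid units,
    # learning rate, initialization scale) is an assumed placeholder.
    N_INPUT = 130         # one hydrophobicity value per residue, in [-1, 1]
    N_HIDDEN = 200        # abstract reports 50 to 1000 hidden neurons
    N_OUTPUT = 130 * 130  # flattened distance matrix, normalized to [0, 1]

    rng = np.random.default_rng(0)
    W1 = rng.normal(0.0, 0.1, (N_HIDDEN, N_INPUT))
    b1 = np.zeros(N_HIDDEN)
    W2 = rng.normal(0.0, 0.1, (N_OUTPUT, N_HIDDEN))
    b2 = np.zeros(N_OUTPUT)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(seq):
        """Map a 130-long hydrophobicity vector to a flattened distance matrix."""
        h = sigmoid(W1 @ seq + b1)
        d = sigmoid(W2 @ h + b2)  # outputs in (0, 1), matching the normalization
        return h, d

    def backprop_step(seq, target, lr=0.01):
        """One gradient-descent update on squared error (hypothetical loss)."""
        global W1, b1, W2, b2
        h, d = forward(seq)
        # Output-layer delta for squared error through sigmoid units
        delta2 = (d - target) * d * (1.0 - d)
        # Hidden-layer delta propagated back through W2
        delta1 = (W2.T @ delta2) * h * (1.0 - h)
        W2 -= lr * np.outer(delta2, h)
        b2 -= lr * delta2
        W1 -= lr * np.outer(delta1, seq)
        b1 -= lr * delta1
        return 0.5 * np.sum((d - target) ** 2)

    # Toy usage: one random sequence/structure pair standing in for the
    # 25-protein training set described in the abstract.
    seq = rng.uniform(-1.0, 1.0, N_INPUT)
    target = rng.uniform(0.0, 1.0, N_OUTPUT)
    loss = backprop_step(seq, target)

Even at this sketch's scale the dominant cost is the 200×16,900 output weight matrix, which gives a sense of why the original experiment required a Cray-2 to reach millions of connection updates per second.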
