Abstract
There are many ways to connect neuron-like cells into large-scale learning networks; these patterns of connection are called architectures. One problem in designing an architecture is choosing the type of neuron-like cell on which to base it. By examining properties of learning networks that are independent of their architectures, this paper argues that there is at least one type of cell which can be used in any reasonable architecture to give it nearly optimal performance. Cells of this type implement an algorithm called second-order least mean square (2nd-order LMS, for short).
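The abstract does not spell out the update rule, but one common reading of "second-order LMS" is an LMS gradient step preconditioned by an estimate of the inverse input autocorrelation matrix, giving a Newton-like step per sample. The sketch below illustrates that idea; the function name, parameters, and running-average estimate of the autocorrelation are illustrative assumptions, not the paper's own formulation.

```python
import numpy as np

def second_order_lms(X, d, mu=0.5, eps=1e-3):
    """Sketch of a second-order LMS adaptive filter.

    Assumed form: the usual LMS step mu * x * e is scaled by the
    inverse of a running estimate of the input autocorrelation
    matrix R, which equalizes convergence across input modes.
    X : (n_samples, n_features) input vectors, one per time step
    d : (n_samples,) desired outputs
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    R = eps * np.eye(n_features)  # regularized autocorrelation accumulator
    for t in range(n_samples):
        x = X[t]
        R = R + np.outer(x, x)            # accumulate x x^T
        e = d[t] - w @ x                  # instantaneous prediction error
        # Newton-like step: precondition the gradient by the averaged R^-1
        w = w + mu * np.linalg.solve(R / (t + 1), x) * e
    return w
```

In this reading, preconditioning by R^-1 is what distinguishes the cell from plain (first-order) LMS: plain LMS converges slowly when the input correlation matrix has widely spread eigenvalues, while the second-order step removes that dependence.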
