Tensor product neural networks and approximation of dynamical systems

Abstract
We consider the problem of approximating any member of a large class of input-output operators of nonlinear dynamical systems. The systems need not be shift invariant, and the system inputs need not be continuous. We introduce a family of "tensor product" dynamical neural networks, and show that a certain continuity condition is necessary and sufficient for the existence of arbitrarily good approximations using this family.
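The separable "tensor product" idea described in the abstract can be illustrated with a small numerical sketch. The target system, the choice of basis functions, and the least-squares fit below are all illustrative assumptions, not the construction from the paper: we approximate a nonlinear, time-varying (hence not shift-invariant) input-output map by a linear combination of products of input-dependent factors and time-dependent factors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: a nonlinear, time-varying (not shift-invariant) system
# whose output at time t depends on a short input window and explicitly on t.
def system(u, t):
    return np.tanh(u[t] + 0.5 * u[t - 1]) * (1 + 0.3 * np.sin(0.1 * t))

T, M = 200, 2                       # time steps, input-memory length
inputs = [rng.uniform(-1, 1, T) for _ in range(50)]

def phi(window):
    # Input-dependent factors: constant, linear, and quadratic terms
    # of the recent input window (an assumed choice of functionals).
    return np.array([1.0, *window, *(window ** 2)])

def psi(t):
    # Time-dependent factors (an assumed choice of basis functions of t).
    return np.array([1.0, np.sin(0.1 * t), np.cos(0.1 * t)])

# Tensor-product features: every product phi_i(input window) * psi_j(t).
X, Y = [], []
for u in inputs:
    for t in range(M, T):
        window = u[t - M + 1 : t + 1][::-1]      # most recent sample first
        X.append(np.outer(phi(window), psi(t)).ravel())
        Y.append(system(u, t))
X, Y = np.array(X), np.array(Y)

# Fit the expansion coefficients by linear least squares.
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ coef
err = np.sqrt(np.mean((pred - Y) ** 2))
print("RMS approximation error:", err)
```

Because the time-varying gain of the toy system lies in the span of the chosen time factors, the separable expansion fits it closely; richer families of input functionals and time functions would tighten the approximation further, which is the regime the paper's continuity condition characterizes.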
