Perceptrons with polynomial post-processing

Abstract
We introduce tensor product neural networks, composed of a layer of univariate neurons followed by a network of polynomial post-processing. We examine the general approximation properties of these networks, observing in particular their relationship to the Stone-Weierstrass theorem for uniform function algebras. Implementing the post-processing as a two-layer network with logarithmic and exponential neurons leads to potentially important 'generalized' product networks, which, however, require a complex approximation theory of Müntz-Szász-Ehrenpreis type. A back-propagation algorithm for product networks is presented and used in three computational experiments. In particular, approximation by a sigmoid product network is compared to that of a single-layer radial basis network and a multilayer sigmoid network. An additional experiment, based on an operational system, further demonstrates the versatility of the architecture.
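The product architecture described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: all function and variable names are assumptions. It shows how a layer of sigmoid neurons followed by logarithmic and exponential neurons realizes 'product units', since exponentiating a weighted sum of logarithms yields a product of powers of the hidden activations.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic activation; outputs lie in (0, 1), so logs are defined.
    return 1.0 / (1.0 + np.exp(-x))

def product_network(x, W1, b1, W2):
    """Illustrative sketch of a sigmoid product network (hypothetical names).

    First layer: univariate sigmoid neurons.
    Post-processing: logarithmic neurons feeding exponential neurons,
    so each output y_k = prod_j h_j ** W2[k, j] is a product unit.
    """
    h = sigmoid(W1 @ x + b1)   # hidden sigmoid activations, each in (0, 1)
    log_h = np.log(h)          # logarithmic neurons
    y = np.exp(W2 @ log_h)     # exponential neurons: products of powers of h
    return y
```

Because exp(sum_j W2[k, j] * log h_j) = prod_j h_j ** W2[k, j], the two post-processing layers compute exactly the generalized products mentioned in the abstract, with the exponents playing the role of trainable weights.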