The Chebyshev-polynomials-based unified model neural networks for function approximation

Abstract
In this paper, we propose the approximate transformable technique, which includes direct and indirect transformations, to obtain a Chebyshev-Polynomials-Based (CPB) unified model neural network for feedforward/recurrent neural networks via Chebyshev polynomial approximation. Based on this approximate transformable technique, we derive the relationship between single-layer neural networks and multilayer perceptron neural networks. It is shown that the CPB unified model neural network can be represented as a functional link network based on Chebyshev polynomials, trained with the recursive least squares method with a forgetting factor. The CPB unified model neural network not only retains the universal approximation capability, but also learns faster than conventional feedforward/recurrent neural networks. Furthermore, we derive the condition under which the unified model generated by Chebyshev polynomials is optimal in the least-squares error sense for the single-variable case. Computer simulations show that the proposed method achieves universal approximation on several function-approximation problems with a considerable reduction in learning time.
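The abstract names two ingredients: a functional link expansion of the input using Chebyshev polynomials, and training by recursive least squares (RLS) with a forgetting factor. The paper's own implementation is not reproduced here; the following is a minimal illustrative sketch of those two ingredients for single-variable function approximation. The target function sin(πx), the expansion order, and the parameters `lam` and `delta` are assumptions chosen for the example, not values from the paper.

```python
import numpy as np

def chebyshev_features(x, order):
    """Expand scalar inputs x in [-1, 1] into the Chebyshev basis T_0..T_order."""
    T = np.empty((x.size, order + 1))
    T[:, 0] = 1.0
    if order >= 1:
        T[:, 1] = x
    for n in range(2, order + 1):
        # Three-term recurrence: T_n(x) = 2x T_{n-1}(x) - T_{n-2}(x)
        T[:, n] = 2.0 * x * T[:, n - 1] - T[:, n - 2]
    return T

def rls_train(Phi, d, lam=0.99, delta=1e3):
    """Recursive least squares with forgetting factor lam (illustrative values)."""
    m = Phi.shape[1]
    w = np.zeros(m)
    P = delta * np.eye(m)          # initial inverse-correlation matrix
    for phi, target in zip(Phi, d):
        Pphi = P @ phi
        k = Pphi / (lam + phi @ Pphi)      # gain vector
        e = target - w @ phi               # a priori prediction error
        w = w + k * e                      # weight update
        P = (P - np.outer(k, Pphi)) / lam  # covariance update with forgetting
    return w

# Example: approximate f(x) = sin(pi x) on [-1, 1] (assumed target, not from the paper).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 400)
d = np.sin(np.pi * x)
Phi = chebyshev_features(x, order=7)
w = rls_train(Phi, d)

x_test = np.linspace(-1.0, 1.0, 200)
y_hat = chebyshev_features(x_test, order=7) @ w
print("max abs error:", np.max(np.abs(y_hat - np.sin(np.pi * x_test))))
```

Because the expansion makes the model linear in its weights, RLS converges in a single pass over the data, which is consistent with the abstract's claim of a considerable reduction in learning time relative to gradient-trained feedforward/recurrent networks.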