Tuning the Structure and Parameters of a Neural Network by Using Hybrid Taguchi-Genetic Algorithm
- 13 February 2006
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 17 (1), 69-80
- https://doi.org/10.1109/tnn.2005.860885
Abstract
In this paper, a hybrid Taguchi-genetic algorithm (HTGA) is applied to the problem of tuning both the structure and the parameters of a feedforward neural network. The HTGA combines the traditional genetic algorithm (TGA), which has a powerful global exploration capability, with the Taguchi method, which can exploit the optimum offspring. The Taguchi method is inserted between the crossover and mutation operations of the TGA; its systematic reasoning ability is incorporated into the crossover operation to select the better genes, thereby enhancing the genetic algorithm. As a result, the HTGA is more robust, statistically sound, and faster to converge. The authors first evaluate the performance of the HTGA on several global numerical optimization problems, and then apply it to three examples: forecasting sunspot numbers, tuning an associative memory, and solving the XOR problem. The numbers of hidden nodes and links of the feedforward neural network are chosen by increasing them from small values until the learning performance is good enough, so a partially connected feedforward neural network is obtained after tuning, which reduces the cost of implementing the network. These tuning problems involve many parameters and numerous local optima, making them challenging benchmarks for evaluating any proposed GA-based approach. The computational experiments show that the HTGA obtains better results than an existing method recently reported in the literature.
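The core of the HTGA is the Taguchi step inserted between crossover and mutation: a two-level orthogonal array mixes genes from two parents into a small set of trial offspring, a signal-to-noise ratio scores each trial, and a main-effect analysis keeps the better parent's gene at every position. The sketch below illustrates that step in Python; the L8(2^7) array, the sphere test function, and the choice of signal-to-noise ratio eta = -f(x) are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

# Two-level L8(2^7) orthogonal array (0 = take gene from parent A, 1 = from parent B).
# It covers chromosomes with up to 7 genes; larger problems need a bigger array.
L8 = np.array([
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
])

def taguchi_crossover(parent_a, parent_b, fitness):
    """Build a near-optimal offspring from two parents via main-effect analysis.

    fitness: objective to MINIMIZE; the signal-to-noise ratio is taken as
    eta = -f(x) here (a common 'smaller-the-better' choice, assumed for
    this sketch).
    """
    n = len(parent_a)
    oa = L8[:, :n]
    # Run the eight 'experiments': each row mixes genes from the two parents.
    etas = np.empty(len(oa))
    for i, row in enumerate(oa):
        trial = np.where(row == 0, parent_a, parent_b)
        etas[i] = -fitness(trial)
    # Main effect of each level of each gene: mean SNR over the rows in
    # which that gene came from that parent; keep the better level.
    child = np.empty(n)
    for j in range(n):
        eff_a = etas[oa[:, j] == 0].mean()
        eff_b = etas[oa[:, j] == 1].mean()
        child[j] = parent_a[j] if eff_a >= eff_b else parent_b[j]
    return child

# Usage: this step would sit between crossover and mutation in a GA loop.
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    rng = np.random.default_rng(0)
    pa, pb = rng.uniform(-5, 5, 5), rng.uniform(-5, 5, 5)
    child = taguchi_crossover(pa, pb, sphere)
    print(sphere(pa), sphere(pb), sphere(child))  # child is typically no worse
```

Because the orthogonal array is balanced, each gene appears at each level in exactly half of the trials, so the main-effect comparison isolates each gene's contribution with only eight fitness evaluations instead of the 2^7 needed for an exhaustive recombination search.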