Sensitivity analysis of multilayer perceptron with differentiable activation functions
- 1 January 1992
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 3 (1), 101-107
- https://doi.org/10.1109/72.105422
Abstract
In a neural network, many different sets of connection weights can approximately realize a given input-output mapping, and the sensitivity of the network varies with the chosen weight set. To select a weight set with lower sensitivity, or to estimate output perturbations in an implementation, it is important to measure the sensitivity with respect to the weights. A sensitivity that depends on the weight set of a single-output multilayer perceptron (MLP) with differentiable activation functions is proposed. Formulas are derived to compute the sensitivity arising from additive or multiplicative weight perturbations, or from input perturbations, for a specific input pattern. The concept of sensitivity is then extended so that it applies to arbitrary input patterns, and several sensitivity measures for the multiple-output MLP are suggested. Computer simulations verify the validity of the proposed sensitivities, showing good agreement between theoretical and simulated results for small weight perturbations.
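The abstract does not reproduce the derived formulas, but for small perturbations such an analysis is first-order: the output change is approximated through the derivatives of the differentiable activations. Below is a minimal sketch of that idea, assuming a tanh hidden layer, a linear output unit, and a first-order Taylor estimate of the output perturbation for one input pattern; the function names and architecture are illustrative, not the paper's notation.

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    """Single-output MLP: tanh hidden layer, linear output unit."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def grad_wrt_W1(x, W1, b1, W2, b2):
    """Analytic dy/dW1 for the scalar output, using tanh'(z) = 1 - tanh(z)**2."""
    z = W1 @ x + b1
    dh = 1.0 - np.tanh(z) ** 2
    return np.outer(W2 * dh, x)  # same shape as W1

rng = np.random.default_rng(0)
n_in, n_hid = 4, 6
x  = rng.normal(size=n_in)
W1 = rng.normal(size=(n_hid, n_in))
b1 = rng.normal(size=n_hid)
W2 = rng.normal(size=n_hid)
b2 = rng.normal()

# Small additive perturbation of the hidden-layer weights.
dW1 = 1e-3 * rng.normal(size=W1.shape)

y_exact = mlp(x, W1 + dW1, b1, W2, b2) - mlp(x, W1, b1, W2, b2)
y_first_order = np.sum(grad_wrt_W1(x, W1, b1, W2, b2) * dW1)

print(f"actual output change: {y_exact:+.3e}")
print(f"first-order estimate: {y_first_order:+.3e}")
```

A multiplicative perturbation W1 * (1 + eps) fits the same estimate with dW1 replaced by W1 * eps. This sketch covers only the single-pattern, single-output case; the paper extends the quantity to arbitrary input patterns and to multiple-output MLPs.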