Bayesian approach to neural-network modeling with input uncertainty
- 1 January 1999
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 10 (6), 1261-1270
- https://doi.org/10.1109/72.809073
Abstract
It is generally assumed when using Bayesian inference methods for neural networks that the input data contain no noise or corruption. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural-network framework which allows for input noise, provided that some model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds a term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network weights using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input.
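The error-bar correction described in the abstract can be sketched as follows. Under a first-order (Laplace-style) expansion, small symmetric input noise with variance σ²ₓ contributes an extra term (∂f/∂x)² σ²ₓ to the predictive variance, on top of the usual Bayesian error bar. The network weights, the value of the usual Bayesian variance, and the noise level below are all illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical one-hidden-layer network with fixed weights
# (stand-in for a trained Bayesian network; not from the paper).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=3), rng.normal(size=3)
W2, b2 = rng.normal(size=3), 0.1

def f(x):
    """Scalar network output for a scalar input x."""
    return W2 @ np.tanh(W1 * x + b1) + b2

def input_noise_term(x, sigma_x, eps=1e-5):
    """Extra variance term (df/dx)^2 * sigma_x^2 from a first-order
    expansion of f around the observed input x."""
    grad = (f(x + eps) - f(x - eps)) / (2.0 * eps)  # central difference
    return grad**2 * sigma_x**2

sigma_bayes2 = 0.05                         # usual Bayesian variance (assumed)
extra = input_noise_term(x=0.5, sigma_x=0.1)  # assumed input-noise std 0.1
total_var = sigma_bayes2 + extra

# The correction is non-negative, so it can only widen the error bar.
print(total_var >= sigma_bayes2)
```

Since the added term is a squared gradient times a variance, it vanishes where the regression surface is flat and grows where the output is sensitive to its inputs, matching the intuition that input noise matters most in steep regions.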
This publication has 15 references indexed in Scilit:
- Accurate navigation via differential GPS and vehicle local sensors. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- A scatterometer neural network sensor model with input noise. Neurocomputing, 2000
- Estimations of error bounds for RBF networks. Published by Institution of Engineering and Technology (IET), 1997
- Bayesian Analysis of Errors-in-Variables Regression Models. Published by JSTOR, 1995
- On the relationship between Bayesian error bars and the input data density. Published by Institution of Engineering and Technology (IET), 1995
- Training with Noise is Equivalent to Tikhonov Regularization. Neural Computation, 1995
- Novelty detection and neural network validation. IEE Proceedings - Vision, Image, and Signal Processing, 1994
- Noise injection into inputs in back-propagation learning. IEEE Transactions on Systems, Man, and Cybernetics, 1992
- Using additive noise in back-propagation training. IEEE Transactions on Neural Networks, 1992
- Creating artificial neural networks that generalize. Neural Networks, 1991