Probabilistic data analysis: an introductory guide
- 1 April 1998
- journal article
- Published by Wiley in Journal of Microscopy
- Vol. 190 (1-2) , 28-36
- https://doi.org/10.1046/j.1365-2818.1998.2780835.x
Abstract
Quantitative science requires the assessment of uncertainty, and this means that measurements and inferences should be described as probability distributions. This is done by building data into a probabilistic likelihood function which produces a posterior ‘answer’ by modulating a prior ‘question’. Probability calculus is the only way of doing this consistently, so that data can be included gradually or all at once while the answer remains the same. However, probability calculus is only a language; it does not restrict the questions one can ask by setting one's prior. We discuss how to set sensible priors, in particular for a large problem like image reconstruction. We also introduce practical modern algorithms (Gibbs sampling, Metropolis algorithm, genetic algorithms, and simulated annealing) for computing probabilistic inference.
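Of the algorithms the abstract names, the Metropolis algorithm is the simplest to sketch. The following is a minimal illustration (not the paper's own implementation): a random-walk proposal is accepted with probability min(1, posterior ratio), so the chain's samples are drawn from the posterior. The target here is assumed to be a standard normal, chosen purely for the example; the function and parameter names are ours.

```python
import math
import random

def metropolis(log_post, x0, proposal_scale, n_steps, seed=0):
    """Random-walk Metropolis sampler.

    Proposes x_new = x + Normal(0, proposal_scale) and accepts with
    probability min(1, exp(log_post(x_new) - log_post(x))).
    """
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, proposal_scale)
        lp_new = log_post(x_new)
        # Accept or reject; on rejection the chain stays at x.
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Example target: a standard normal posterior, log p(x) = -x^2/2 + const.
samples = metropolis(lambda x: -0.5 * x * x,
                     x0=0.0, proposal_scale=1.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

The sample mean and variance should approach 0 and 1 respectively as the chain lengthens; in practice one would also discard an initial burn-in and monitor convergence, e.g. with the multiple-sequence diagnostics of Gelman and Rubin (1992).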