Forecasting Using Relative Entropy
Preprint
- 1 November 2002
- Published by Elsevier in SSRN Electronic Journal
Abstract
The paper describes a relative entropy procedure for imposing moment restrictions on simulated forecast distributions from a variety of models. Starting from an empirical forecast distribution for some variables of interest, the technique generates a new empirical distribution that satisfies a set of moment restrictions. The new distribution is chosen to be as close as possible to the original in the sense of minimizing the associated Kullback-Leibler Information Criterion, or relative entropy. The authors illustrate the technique with several examples that show how restrictions from other forecasts and from economic theory may be introduced into a model's forecasts.
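To make the minimization concrete, the sketch below shows the standard exponential-tilting solution that this kind of relative entropy problem implies: the new weight on each simulated draw is proportional to exp(γ′g(xᵢ)), with γ obtained by solving the convex dual of the entropy problem. This is a minimal illustration, not the authors' code; the function name, the use of `scipy.optimize`, and the example moment restriction are assumptions made for exposition.

```python
import numpy as np
from scipy.optimize import minimize


def relative_entropy_weights(moments, target):
    """Reweight equally weighted draws to satisfy E_pi[g(X)] = target.

    moments : (N, k) array of g(x_i) evaluated at each simulated draw x_i
    target  : (k,) vector of imposed moment values
    Returns an (N,) vector of weights pi that minimizes the Kullback-Leibler
    divergence from the original uniform weights, subject to the restriction.
    """
    dev = moments - target                      # deviations from the imposed moments

    def dual(gamma):
        # Convex dual of the entropy problem: log of the mean tilted weight.
        z = dev @ gamma
        zmax = z.max()                          # guard against overflow
        return zmax + np.log(np.mean(np.exp(z - zmax)))

    k = moments.shape[1]
    gamma = minimize(dual, np.zeros(k), method="BFGS").x

    w = np.exp(dev @ gamma - (dev @ gamma).max())   # exponential tilting
    return w / w.sum()


# Illustrative use: impose a mean of 0.5 on standard-normal forecast draws.
rng = np.random.default_rng(0)
draws = rng.standard_normal(10_000)
pi = relative_entropy_weights(draws[:, None], np.array([0.5]))
print(pi @ draws)   # approximately 0.5
```

Because the dual is smooth and convex in γ, a quasi-Newton solver is typically sufficient; the reweighted draws can then be used directly, or resampled, to form the new empirical forecast distribution described in the abstract.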