Forecasting Using Relative Entropy

Abstract
The paper describes a relative entropy procedure for imposing moment restrictions on simulated forecast distributions from a variety of models. Starting from an empirical forecast distribution for some variables of interest, the technique generates a new empirical distribution that satisfies a set of moment restrictions. The new distribution is chosen to be as close as possible to the original in the sense of minimizing the associated Kullback-Leibler Information Criterion, or relative entropy. The authors illustrate the technique with several examples showing how restrictions from other forecasts and from economic theory may be introduced into a model's forecasts.
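
For concreteness, one standard way to formalize the procedure sketched above is as follows; the notation ($x^{(i)}$, $\pi_i$, $g$, $\bar{g}$, $\gamma$) is generic and chosen for illustration rather than taken verbatim from the paper. Given $N$ simulated draws $x^{(1)},\dots,x^{(N)}$ from the model's forecast distribution with initial weights $\pi_i$ (typically $\pi_i = 1/N$), the reweighted distribution solves

\[
\{\pi_i^*\} \;=\; \arg\min_{\{\tilde{\pi}_i\}} \sum_{i=1}^{N} \tilde{\pi}_i \ln\frac{\tilde{\pi}_i}{\pi_i}
\quad \text{subject to} \quad
\sum_{i=1}^{N} \tilde{\pi}_i \, g\bigl(x^{(i)}\bigr) = \bar{g},
\qquad
\sum_{i=1}^{N} \tilde{\pi}_i = 1,
\]

where $g$ collects the moment functions and $\bar{g}$ the values the restrictions impose. The objective is the Kullback-Leibler Information Criterion (relative entropy) of the new weights with respect to the original ones, and the minimizer takes the familiar exponential-tilting form

\[
\pi_i^* \;=\; \frac{\pi_i \exp\bigl(\gamma' g(x^{(i)})\bigr)}{\sum_{j=1}^{N} \pi_j \exp\bigl(\gamma' g(x^{(j)})\bigr)},
\]

with the vector of Lagrange multipliers $\gamma$ chosen so that the moment restrictions hold under the new weights.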