Abstract
This paper exploits the "stochastic inflation" paradigm introduced by Starobinskii to study the evolution of long-wavelength modes of a free scalar field Φ in an inflationary Universe. When the assumption of a "slow roll" is relaxed, it becomes apparent that the well-known late-time infrared divergence of the vacuum for a massless field in de Sitter space may be viewed as a consequence of the fluctuation-dissipation theorem. The stochastic model is also extended to allow for nonvacuum states and power-law inflation, situations in which the fluctuation-dissipation theorem no longer holds. One recovers the correct late-time form of the expectation value 〈Φ²〉 in these cases as well, thereby corroborating the intuitive picture that, quite generally, the long-wavelength modes of the field evolve in a thermal "bath" provided by the shorter-wavelength modes.
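
For orientation, the standard stochastic-inflation estimate behind the late-time growth of 〈Φ²〉 can be sketched as follows. This is a minimal sketch in textbook conventions, not necessarily those of the paper: H denotes the constant de Sitter Hubble rate, φ the coarse-grained long-wavelength field, and f(t) the effective noise supplied by short-wavelength modes crossing the horizon.

% Starobinsky-type Langevin equation for the coarse-grained field,
% driven by white noise from the short-wavelength modes:
\begin{align}
  \dot{\phi}(t) &= f(t), &
  \langle f(t)\, f(t') \rangle &= \frac{H^{3}}{4\pi^{2}}\,\delta(t - t') .
\end{align}
% Integrating from \phi(0) = 0 gives the familiar linear late-time growth,
% i.e. the infrared divergence of the massless de Sitter vacuum:
\begin{equation}
  \langle \phi^{2}(t) \rangle = \frac{H^{3}}{4\pi^{2}}\, t .
\end{equation}

The unbounded linear growth of 〈φ²〉 with time is the late-time infrared divergence referred to in the abstract; the paper reinterprets this behavior, and its analogues for nonvacuum states and power-law inflation, within the stochastic framework.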