Abstract
A combined analytic and numerical study of the effect of a plasma density gradient on the beam-plasma instability is presented. After a thorough discussion of the qualitative aspects of the problem, emphasis is placed on the development of a theory that can reveal the interplay between the gradient and the multidimensional aspects of the instability. Using fluid equations, a differential equation is derived for the electrostatic potential. An integral representation for the solution of this equation is obtained and its asymptotic evaluation presented. An explicit comparison is made between this asymptotic result and a direct numerical integration of the basic equations. The predictions of the theory are presented for the two-dimensional case, both for a beam propagating along the direction of the gradient and for a beam propagating at an angle to this direction. Some brief remarks are made concerning the fully three-dimensional case.