Mean Field Dynamics of Boson Stars
Preprint, 15 April 2005
Abstract
We consider a quantum mechanical system of N bosons with relativistic dispersion interacting through a mean field Coulomb potential (attractive or repulsive). We choose the initial wave function to describe a condensate, in which all N bosons occupy the same one-particle state. Starting from the N-body Schrödinger equation, we prove that, in the limit as N tends to infinity, the time evolution of the one-particle density is governed by the relativistic nonlinear Hartree equation. This equation is used to describe the dynamics of boson stars (Chandrasekhar theory). The corresponding static problem was rigorously solved by Lieb and Yau.
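For orientation (the following display is not quoted from the paper; the mass parameter m, the coupling constant λ, and the placement of the 1/N factor follow one common convention, in units where ℏ = c = 1), the mean field N-body Hamiltonian and the relativistic nonlinear Hartree equation referred to above are typically written as
\[
  H_N \;=\; \sum_{j=1}^{N} \sqrt{-\Delta_{x_j} + m^2} \;+\; \frac{\lambda}{N}\sum_{i<j}^{N} \frac{1}{|x_i - x_j|},
\qquad
  i\,\partial_t \varphi_t \;=\; \sqrt{-\Delta + m^2}\,\varphi_t \;+\; \lambda\Bigl(\frac{1}{|x|} * |\varphi_t|^2\Bigr)\varphi_t ,
\]
where * denotes convolution, λ < 0 corresponds to the attractive case relevant to boson stars, and |φ_t|² plays the role of the limiting one-particle density obtained from the N-body dynamics as N tends to infinity.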