Abstract
In spontaneously broken gauge theories, the gauge symmetry is restored at high temperatures, as in the early universe. As the universe cools, it undergoes a phase transition from the symmetric to the asymmetric phase. We examine two interesting cosmological effects of such phase transitions in detail. First, the latent heat released in the SU(2)×U(1)→U(1) phase transition decreases the previously generated baryon number to entropy ratio. This decrease is calculated as a function of the Higgs meson mass, and the range of Higgs masses for which the decrease is significant (and perhaps unacceptably large) is found. Second, the observation that the present universe has a low vacuum energy density (cosmological constant) implies that, prior to symmetry breakdown, the universe had a very large vacuum energy density, and this large energy density may have affected the expansion of the universe. We show that, in any phase transition, there exists a range of parameters for which the vacuum energy dominates the radiation energy for a short time, causing an exponential cosmic expansion; the corresponding range of Higgs masses in SU(2)×U(1) is found. A suggestion that the large vacuum energy could reverse the collapse of a collapsing universe is investigated, and it is shown that such a "bounce" will not occur within the framework of perturbative gauge theories, unless the vacuum energy always dominates the radiation density before the bounce, a condition not satisfied by our present universe.
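The claim that a temporarily dominant vacuum energy drives exponential expansion can be sketched with the standard Friedmann equation (an illustrative sketch only; the notation $\rho_{\mathrm{rad}}$, $\rho_{\mathrm{vac}}$ is introduced here, not taken from the abstract):

```latex
% Friedmann equation for a flat radiation + vacuum universe
H^2 \equiv \left(\frac{\dot a}{a}\right)^2
  = \frac{8\pi G}{3}\left(\rho_{\mathrm{rad}} + \rho_{\mathrm{vac}}\right),
\qquad \rho_{\mathrm{rad}} \propto a^{-4},\quad \rho_{\mathrm{vac}} = \text{const}.
```

Since $\rho_{\mathrm{rad}}$ redshifts as $a^{-4}$ while $\rho_{\mathrm{vac}}$ stays constant, once the vacuum term dominates the Hubble rate approaches the constant value $H \simeq \sqrt{8\pi G\,\rho_{\mathrm{vac}}/3}$, giving $a(t) \propto e^{Ht}$, i.e., the exponential cosmic expansion described above.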