The Distribution of High‐Redshift Galaxy Colors: Line‐of‐Sight Variations in Neutral Hydrogen Absorption

Abstract
We model, via Monte Carlo simulations, the distribution of observed U-B, B-V, and V-I galaxy colors over the range 1.75 < z < 5 caused by variations in the line-of-sight opacity due to neutral hydrogen (H I). We also include H I internal to the source galaxies. Even without internal H I absorption, comparison of the distribution of simulated colors with the analytic approximations of Madau et al. reveals systematically different mean colors and scatter. The differences arise in part because we use more realistic distributions of column densities and Doppler parameters. There are, however, also mathematical problems with applying mean and standard-deviation opacities, and doing so yields unphysical results; our Monte Carlo approach corrects these problems. Including H I absorption internal to the galaxies generally diminishes the scatter in the observed colors at a given redshift, but at the redshifts of interest this diminution occurs only in the colors that use the bluest bandpass. Internal column densities below 10^17 cm^-2 do not affect the observed colors, while column densities above 10^18 cm^-2 yield a limiting distribution of high-redshift galaxy colors. As one application of our analysis, we consider the sample completeness as a function of redshift for a single spectral energy distribution (SED), given the multicolor selection boundaries proposed by Madau et al. for the Hubble Deep Field. We argue that the only correct procedure for estimating the z > 3 galaxy luminosity function from color-selected samples is to measure the (observed) distribution of redshifts and intrinsic SED types and then consider the variation in color for each SED and redshift. A similar argument applies to estimating the luminosity function of color-selected, high-redshift QSOs.
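The mathematical problem with applying a mean opacity can be illustrated with a toy Monte Carlo sketch. Because the transmission e^(-tau) is a convex function of optical depth, Jensen's inequality guarantees <e^(-tau)> >= e^(-<tau>), so inserting the mean opacity into a spectrum systematically underestimates the mean transmission and discards the sight-line-to-sight-line scatter entirely. The absorber counts and per-absorber optical depths below are illustrative assumptions for the sketch, not the column-density and Doppler-parameter distributions used in the paper:

```python
import math
import random

random.seed(1)

def sightline_tau(mean_absorbers=15.0, mean_tau_each=0.05):
    """Total Ly-alpha optical depth along one random sight line.

    Toy model: a Poisson number of absorbers (drawn by counting
    unit-rate exponential inter-arrivals), each contributing an
    exponentially distributed optical depth. Illustrative only.
    """
    n = 0
    t = random.expovariate(1.0)
    while t < mean_absorbers:
        n += 1
        t += random.expovariate(1.0)
    return sum(random.expovariate(1.0 / mean_tau_each) for _ in range(n))

taus = [sightline_tau() for _ in range(20000)]
mean_tau = sum(taus) / len(taus)

# Correct Monte Carlo estimate: average the transmission over sight lines.
mean_transmission = sum(math.exp(-t) for t in taus) / len(taus)

# "Mean opacity" shortcut: exponentiate the averaged optical depth.
shortcut_transmission = math.exp(-mean_tau)

# Convexity of e^(-tau) forces the shortcut to fall below the true mean.
print(mean_transmission, shortcut_transmission)
assert mean_transmission > shortcut_transmission
```

Any distribution of tau with nonzero scatter produces the same ordering; the gap grows with the variance in line-of-sight opacity, which is why the effect matters most in the bluest bandpass, where the number of intervening absorbers is largest.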