Abstract
A discussion of the history of thermometer exposure in the United States in the nineteenth and early twentieth centuries documents a wide variety of exposure types. The different exposures have likely contributed to heterogeneities in datasets that have been assumed to be relatively homogeneous. A field experiment was performed to determine the difference in temperature as measured in a cotton region shelter (CRS) and by an unscreened thermometer in a “north-wall” exposure, the most common type of thermometer exposure in the United States during the nineteenth century. Systematic differences in temperature between the two sites throughout the year are likely related to the different responses of each site to the annual cycle of the radiation budget. The average annual temperature, calculated as (max + min)/2, is about 0.5°C higher in the CRS and, if representative of the United States as a whole, implies that nineteenth-century temperature records have a cold bias due to differences in thermometer exposure. However, a definite conclusion cannot yet be made because of evidence of significant positive biases in some nineteenth-century U.S. temperature records.
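As a minimal illustration of the convention used above, the daily mean temperature is taken as (max + min)/2 and then averaged over the period of record; the sketch below uses hypothetical daily values, not data from the experiment.

```python
# Hypothetical daily maxima and minima in °C (illustrative only).
daily_max = [10.0, 12.5, 11.0]
daily_min = [2.0, 3.5, 1.5]

# Daily mean defined as (Tmax + Tmin) / 2, the convention cited in the abstract.
daily_means = [(tmax + tmin) / 2 for tmax, tmin in zip(daily_max, daily_min)]

# Average of the daily means over the period of record.
period_mean = sum(daily_means) / len(daily_means)
```

A station-to-station bias such as the ~0.5°C CRS vs. north-wall difference would appear as an offset in `period_mean` computed from each site's readings.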