Abstract
The determination of soil water content by time domain reflectometry (TDR) involves the measurement of the propagation velocity of a radio frequency (RF) pulse in soil and the conversion of this measurement to an estimate of soil water content. The objective of this study was to quantify errors in this conversion. A comprehensive error analysis can only be carried out if a well-defined measurement procedure is followed. We define and recommend such a procedure and identify the physical origin and the magnitude of each of the error sources in the procedure. Total measurement errors are calculated as the square root of the sum of the squared individual errors. We express propagation velocity measurements in terms of time intervals, that is, the travel time of an RF pulse in soil (T) relative to that in air (Tair). Our measurement and error calculation procedure is based on an analysis of our own and published data for nonclay soils, which reveals that in all cases there is a linear relation between T/Tair and volumetric water content (θv), with a slope and intercept that are mutually independent. A theoretical explanation for this observation is presented. We show that the dominant measurement error source is the transition time of the TDR reflection, which in turn is a function of θv. Absolute measurement errors range from 0.015 to 0.028 m3 m-3 if the T/Tair vs. θv intercept is known, and increase to 0.023 to 0.034 m3 m-3 if a nominal value (1.55) is used. We recommend that a T/Tair vs. θv slope value of 0.1193 (5% less than theory) be used for measurements in nonclay agricultural soils, with little loss of accuracy. Measurement resolution is primarily a function of the time base error of the TDR instrument and can be as good as 0.0018 m3 m-3.
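The conversion described above can be sketched in a few lines of code. This is a minimal illustration, not the authors' implementation: it assumes the linear calibration is inverted as θv = slope × (T/Tair − intercept), using the slope (0.1193) and nominal intercept (1.55) quoted in the abstract, and combines independent error sources by the root of the sum of squares; the function names and example error values are hypothetical.

```python
import math


def theta_v(t_ratio: float, slope: float = 0.1193, intercept: float = 1.55) -> float:
    """Estimate volumetric water content (m3 m-3) from the travel-time
    ratio T/Tair, assuming theta_v = slope * (T/Tair - intercept).
    Defaults are the recommended slope and nominal intercept from the
    abstract; site-specific calibration of the intercept reduces error.
    """
    return slope * (t_ratio - intercept)


def total_error(*errors: float) -> float:
    """Combine independent error sources as the square root of the
    sum of their squares (root-sum-square propagation)."""
    return math.sqrt(sum(e * e for e in errors))


# A dry soil (T/Tair near the intercept) yields theta_v near zero;
# wetter soils have larger travel-time ratios.
print(theta_v(1.55))  # ~0.0 m3 m-3
print(theta_v(3.55))  # 0.1193 * 2.0 = 0.2386 m3 m-3
```

Note that when the intercept is not calibrated for the soil at hand, the abstract indicates the absolute error grows from the 0.015-0.028 m3 m-3 range to 0.023-0.034 m3 m-3, which is exactly the kind of additional term `total_error` would absorb.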