Abstract
Background: Various forms of least-squares regression analysis are used to estimate average systematic error (bias) and its confidence interval in method-comparison studies. When the assumptions that underlie a particular regression method are inappropriate for the data, the estimated statistics are in error. In this report, I present an improved method for regression analysis that is free from the usual simplifying assumptions and is generally applicable to linearly related method-comparison data.

Methods: Theoretical equations based on the Deming approach, further developed by physicists and extended herein, were applied to method-comparison data analysis. Monte Carlo simulations were used to demonstrate the validity of the new procedure and to compare its performance with ordinary linear regression (OLR) and simple Deming regression (SDR) procedures.

Results: Simulation studies included three types of data commonly encountered in method-comparison studies: (a) constant within-method SDs for both methods, (b) constant within-method CVs for both methods, and (c) neither SDs nor CVs constant for either method. For all cases examined, OLR produced unreliable confidence intervals for the estimated bias. However, OLR point estimates of systematic bias were reliable when the correlation coefficient was >0.975. SDR produced reliable estimates of systematic bias for all cases studied, but its confidence intervals for systematic bias were unreliable when the SDs of the methods varied as a function of analyte concentration.

Conclusion: Only iteratively reweighted general Deming regression produced statistically unbiased estimates of systematic bias and reliable confidence intervals of bias for all cases.
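For orientation, the simple Deming regression (SDR) mentioned above has a well-known closed-form solution when the ratio of the two methods' error variances is assumed known. The sketch below illustrates that standard SDR slope and intercept; it is not the iteratively reweighted general Deming procedure developed in this report, and the function and parameter names are illustrative, not taken from the paper.

```python
import math

def deming_fit(x, y, delta=1.0):
    """Simple Deming regression: fit y = b0 + b1*x when both x and y carry
    measurement error.  delta is the (assumed known) ratio of error
    variances var(err_y) / var(err_x); delta = 1 gives orthogonal
    regression.  Standard textbook SDR, not the paper's iteratively
    reweighted generalization."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    # Closed-form slope of the errors-in-variables (Deming) line
    b1 = (syy - delta * sxx
          + math.sqrt((syy - delta * sxx) ** 2
                      + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
    b0 = my - b1 * mx
    return b0, b1

# Example: noise-free data along y = 2x is recovered exactly
b0, b1 = deming_fit([1.0, 2.0, 3.0, 4.0, 5.0],
                    [2.0, 4.0, 6.0, 8.0, 10.0])
# b1 is 2.0 and b0 is 0.0 for this exact-fit case
```

Unlike OLR, which attributes all scatter to the y method and so underestimates the slope when the x method is also noisy, the Deming line minimizes weighted perpendicular-style distances to both axes, which is why SDR point estimates of bias remain reliable in the cases described above.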