In an errors-in-variables regression model, the least squares estimate is generally inconsistent for the complete regression parameter but can be consistent for certain linear combinations of this parameter. We explore the conjecture that, when the least squares estimate is consistent for a linear combination of the regression parameter, it will be preferred to an errors-in-variables estimate, at least asymptotically. The conjecture is false in general, but it holds for some important classes of problems. One such problem is the randomized two-group analysis of covariance, on which we focus.
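The phenomenon the abstract describes can be illustrated with a small simulation (a hedged sketch, not taken from the paper; all parameter values below are hypothetical). With a randomized group indicator Z independent of the mismeasured covariate W = X + U, the least squares slope on W is attenuated toward beta times the reliability ratio, yet the coefficient on Z remains consistent for the treatment effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta, tau = 1.0, 2.0           # hypothetical true slope and treatment effect
sigma_u2 = 1.0                 # hypothetical measurement-error variance

x = rng.normal(size=n)                                    # true covariate
w = x + rng.normal(scale=np.sqrt(sigma_u2), size=n)       # observed covariate with error
z = rng.integers(0, 2, size=n).astype(float)              # randomized two-group indicator
y = beta * x + tau * z + rng.normal(size=n)

# Naive least squares of y on (1, w, z), ignoring the measurement error
X = np.column_stack([np.ones(n), w, z])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

lam = 1.0 / (1.0 + sigma_u2)   # reliability ratio Var(x) / Var(w)
print(coef[1], beta * lam)     # slope on w: inconsistent, attenuated toward beta * lam
print(coef[2], tau)            # treatment contrast: consistent despite the error
```

Because randomization makes Z uncorrelated with W in the population, the two slope estimates decouple asymptotically, so the bias in the covariate slope does not contaminate the treatment-effect estimate; this is the linear combination for which least squares is consistent.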