Unified theory of least squares
- 1 January 1973
- journal article
- research article
- Published by Taylor & Francis in Communications in Statistics
- Vol. 1 (1), 1-8
- https://doi.org/10.1080/03610927208827002
Abstract
Let (Y, Xβ, σ²G) be the Gauss-Markoff model, where E(Y) = Xβ and D(Y) = σ²G, and where A′ denotes the transpose of the matrix A. Further, let β* be a stationary point (supposed to exist for all Y) of (Y − Xβ)′M(Y − Xβ), i.e., a point where its derivative with respect to β is the zero vector. It is shown that if p′β* is the BLUE of p′β for every p in ℳ(X′), the linear space generated by the columns of X′, and an unbiased estimator of σ² is f⁻¹(Y − Xβ*)′M(Y − Xβ*) with f = R(G : X) − R(X), where R(V) denotes the rank of V, then it is necessary and sufficient that M is a symmetric g-inverse of (G + XUX′), where U is any symmetric matrix such that ℳ(G + XUX′) = ℳ(G : X). The method is valid whether G is singular or not and whether R(X) is of full rank or not. A simple choice of U is always U = k²I, k ≠ 0.
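The abstract spells out a complete computational recipe, which the minimal NumPy sketch below illustrates under stated assumptions: the design X, dispersion G, and scalar k are made-up illustrative data, and the Moore-Penrose pseudoinverse is used as one convenient symmetric g-inverse of G + k²XX′; none of these choices are prescribed by the paper beyond the formulas quoted above.

```python
# Minimal sketch of the unified least-squares recipe from the abstract,
# assuming NumPy only. X, G, k are illustrative; M is a symmetric g-inverse
# of G + k^2 * X X', here obtained via the Moore-Penrose pseudoinverse.
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 2
X = rng.standard_normal((n, m))            # design matrix (need not be full rank)
beta_true = np.array([1.0, -2.0])
L = rng.standard_normal((n, n - 2))
G = L @ L.T                                # possibly singular dispersion matrix
Y = X @ beta_true + L @ rng.standard_normal(n - 2)   # errors with dispersion G

k = 1.0                                    # simple choice U = k^2 I, k != 0
M = np.linalg.pinv(G + k**2 * (X @ X.T))   # symmetric g-inverse of G + X U X'

# Stationary point of (Y - Xb)' M (Y - Xb): setting the derivative in b to
# zero gives the normal equations X' M X b = X' M Y.
beta_star, *_ = np.linalg.lstsq(X.T @ M @ X, X.T @ M @ Y, rcond=None)

# Unbiased estimator of sigma^2: f^{-1} (Y - Xb*)' M (Y - Xb*),
# with f = R(G : X) - R(X).
resid = Y - X @ beta_star
f = np.linalg.matrix_rank(np.hstack([G, X])) - np.linalg.matrix_rank(X)
sigma2_hat = (resid @ M @ resid) / f

print(beta_star, sigma2_hat)
```

The choice U = k²I works here because G and k²XX′ are both nonnegative definite, so the column space of G + k²XX′ equals ℳ(G : X); the pseudoinverse of this symmetric matrix is then a symmetric g-inverse of the required form even when G is singular or X is rank-deficient.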