Families of Minimax Estimators of the Mean of a Multivariate Normal Distribution

Abstract
Ever since Stein's result that the sample mean vector $\mathbf{X}$ of a $k \geqq 3$ dimensional normal distribution is an inadmissible estimator of its expectation $\boldsymbol{\theta}$, statisticians have searched for uniformly better (minimax) estimators. Unbiased estimators of the risk of arbitrary orthogonally invariant and scale-invariant estimators of $\boldsymbol{\theta}$ are derived here for the case in which the dispersion matrix $\Sigma$ of $\mathbf{X}$ is unknown and must be estimated. Stein obtained this result earlier for known $\Sigma$. Minimax conditions weaker than any yet published are derived by finding all estimators whose unbiased estimate of risk is bounded uniformly by $k$, the risk of $\mathbf{X}$. One sequence of risk functions and risk estimates applies simultaneously to the various assumptions about $\Sigma$, resulting in a unified theory for these situations.
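For background, the known-covariance identity referred to above can be sketched in the simplest case $\Sigma = I$; the notation $g$ and the identity below are the standard form of Stein's unbiased estimate of risk, not this paper's own development.

```latex
% Stein's unbiased risk estimate, sketched for the known-covariance
% case $\Sigma = I$ (a standard identity; notation is illustrative).
% For an estimator $\hat{\boldsymbol{\theta}}(\mathbf{X}) = \mathbf{X} + g(\mathbf{X})$
% with $g$ weakly differentiable,
\[
  E_{\boldsymbol{\theta}} \,
  \| \hat{\boldsymbol{\theta}}(\mathbf{X}) - \boldsymbol{\theta} \|^2
  \;=\;
  k + E_{\boldsymbol{\theta}}
  \bigl[\, 2\, \nabla \cdot g(\mathbf{X}) + \| g(\mathbf{X}) \|^2 \,\bigr],
\]
% so any $g$ with $2\, \nabla \cdot g + \| g \|^2 \leqq 0$ everywhere gives a
% minimax estimator: its risk is bounded uniformly by $k$, the risk of
% $\mathbf{X}$ itself.  The James--Stein choice
% $g(\mathbf{x}) = -\frac{k-2}{\| \mathbf{x} \|^2}\, \mathbf{x}$ yields
\[
  2\, \nabla \cdot g(\mathbf{x}) + \| g(\mathbf{x}) \|^2
  \;=\; -\,\frac{(k-2)^2}{\| \mathbf{x} \|^2} \;\leqq\; 0 ,
\]
% confirming that it dominates $\mathbf{X}$ when $k \geqq 3$.
```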