# Is the theory of minimum variance unbiased estimation overemphasized in graduate school?

Recently I was very embarrassed when I gave an off-the-cuff answer about minimum variance unbiased estimates for the parameters of a uniform distribution that was completely wrong. Fortunately I was immediately corrected by cardinal and Henry, with Henry providing the correct answers for the OP.

This got me thinking, though. I learned the theory of best unbiased estimators in my graduate math stat class at Stanford some 37 years ago. I have recollections of the Rao-Blackwell theorem, the Cramér-Rao lower bound and the Lehmann-Scheffé theorem. But as an applied statistician I don't think very much about UMVUEs in my daily work, whereas maximum likelihood estimation comes up a lot.

Why is that? Do we overemphasize UMVUE theory in graduate school? I think so. First of all, unbiasedness is not a crucial property: many perfectly good MLEs are biased, and Stein shrinkage estimators are biased yet dominate the unbiased MLE under mean squared error loss. UMVUE theory is very beautiful, but it is incomplete and, I think, not very useful in practice. What do others think?
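The Stein phenomenon mentioned above is easy to see in a small simulation. The sketch below (my own illustration, not from the thread; the dimension $p$ and the true mean vector are arbitrary choices) compares the risk of the unbiased MLE $X$ with that of the James-Stein estimator $\left(1 - \frac{p-2}{\lVert X \rVert^2}\right)X$ for a multivariate normal mean:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 10                      # dimension; James-Stein dominance requires p >= 3
theta = np.ones(p)          # true mean vector (arbitrary choice)
n_sim = 100_000

# One observation X ~ N(theta, I_p) per replication
X = theta + rng.standard_normal((n_sim, p))
mle = X                                         # unbiased MLE of theta
norm2 = np.sum(X**2, axis=1, keepdims=True)
js = (1 - (p - 2) / norm2) * X                  # James-Stein shrinkage toward 0

mse_mle = np.mean(np.sum((mle - theta)**2, axis=1))
mse_js = np.mean(np.sum((js - theta)**2, axis=1))
print(mse_mle, mse_js)     # James-Stein risk is strictly smaller
```

The MLE's risk is exactly $p$ here, while the (biased) James-Stein estimator's risk is noticeably lower, even though it shrinks toward an arbitrary point.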

We know that

If $X_1,X_2,\dots,X_n$ is a random sample from $\text{Poisson}(\lambda)$, then for any $\alpha \in (0,1)$, $T_\alpha = \alpha \bar X + (1-\alpha)S^2$ is an unbiased estimator of $\lambda$.

Hence there exist infinitely many unbiased estimators of $\lambda$, and the question arises: which of them should we choose? That is what motivates the UMVUE. Unbiasedness alone is not a good property, but being the UMVUE is a good property, though not an extremely strong one.
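A quick Monte Carlo check of this (a sketch of my own, with arbitrary choices of $\lambda$, $n$, and $\alpha$): every $T_\alpha$ has mean $\lambda$, but their variances differ, and mixing in $S^2$ inflates the variance relative to $\bar X$, which is the UMVUE here:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, n_sim = 3.0, 20, 200_000
alpha = 0.4                               # any alpha in (0,1) works

x = rng.poisson(lam, size=(n_sim, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)                # unbiased sample variance
t_alpha = alpha * xbar + (1 - alpha) * s2

print(t_alpha.mean())                     # close to lam = 3.0, so unbiased
print(t_alpha.var(), xbar.var())          # T_alpha has larger variance than xbar
```

For the Poisson family, $E\,\bar X = E\,S^2 = \lambda$, so any convex combination is unbiased; the simulation just makes the variance penalty of using $S^2$ visible.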

If $X_1,X_2,\dots,X_n$ is a random sample from $N(\mu, \sigma^2)$, then among estimators of the form $T_\alpha = \alpha S^2$, where $(n-1)S^2=\sum_{i=1}^{n}(X_i-\bar X)^2$, the minimum-MSE estimator of $\sigma^2$ is $\frac{n-1}{n+1}S^2=\frac{1}{n+1}\sum_{i=1}^{n}(X_i-\bar X)^2$. But it is biased, so it is not the UMVUE, even though it is best in terms of MSE.
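This ordering can be verified by simulation. The sketch below (my own, with arbitrary $n$ and $\sigma^2$) compares the MSE of the three standard scalings of $\sum(X_i-\bar X)^2$: the unbiased $S^2$ (divide by $n-1$), the MLE (divide by $n$), and the minimum-MSE choice (divide by $n+1$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, n_sim = 10, 4.0, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(n_sim, n))
ss = np.sum((x - x.mean(axis=1, keepdims=True))**2, axis=1)  # sum of squares

def mse(c):
    """Monte Carlo MSE of the estimator c * sum((X_i - Xbar)^2) for sigma^2."""
    return np.mean((c * ss - sigma2)**2)

# Unbiased (UMVUE), MLE, and minimum-MSE scalings, in decreasing MSE order
print(mse(1/(n-1)), mse(1/n), mse(1/(n+1)))
```

Since $\sum(X_i-\bar X)^2 \sim \sigma^2\chi^2_{n-1}$, the exact MSEs are $\tfrac{2\sigma^4}{n-1}$, $\tfrac{2n-1}{n^2}\sigma^4$, and $\tfrac{2\sigma^4}{n+1}$ respectively, and the simulation reproduces that ordering.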

Note that the Rao-Blackwell theorem says that in searching for the UMVUE we can restrict attention to unbiased estimators that are functions of a sufficient statistic: the UMVUE is the estimator with minimum variance among all unbiased estimators that are functions of a sufficient statistic. Hence the UMVUE is necessarily a function of a sufficient statistic.
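A classical illustration of Rao-Blackwellization (my own sketch, not from the thread): to estimate $P(X=0)=e^{-\lambda}$ in the Poisson model, start from the crude unbiased estimator $\mathbf{1}\{X_1=0\}$ and condition on the sufficient statistic $T=\sum X_i$, which gives $E[\mathbf{1}\{X_1=0\}\mid T]=(1-1/n)^T$, the UMVUE. Both are unbiased, but conditioning slashes the variance:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, n_sim = 2.0, 15, 200_000
target = np.exp(-lam)                     # P(X = 0) = e^{-lambda}

x = rng.poisson(lam, size=(n_sim, n))
crude = (x[:, 0] == 0).astype(float)      # unbiased but very rough
t = x.sum(axis=1)                         # sufficient statistic
rb = (1 - 1/n) ** t                       # E[crude | t] -- the UMVUE

print(crude.mean(), rb.mean(), target)    # both means are close to e^{-lam}
print(crude.var(), rb.var())              # rb has much smaller variance
```

This is exactly the mechanism the theorem describes: conditioning an unbiased estimator on a sufficient statistic preserves unbiasedness and can only reduce variance.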

Both the MLE and the UMVUE are good from some point of view, but we can never say that one of them is uniformly better than the other. In statistics we deal with uncertain, random data, so there is always scope for improvement: for a given problem we may find an estimator better than both the MLE and the UMVUE.

I don't think we overemphasize UMVUE theory in graduate school. This is purely my personal view. Graduate school is a learning stage, so a graduate student needs to build a solid foundation in UMVUE theory and in other approaches to estimation.