Let $X_1, X_2, \dots, X_n$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x : \theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(X_1, X_2, \dots, X_n)$ is a sufficient statistic for $\theta$, and let $\{ f_Y(y : \theta) : \theta \in \Omega \}$ be a complete family. If $\operatorname{E}[\varphi(Y)] = \theta$ then $\varphi(Y)$ is the unique MVUE of $\theta$.
By the Rao–Blackwell theorem, if $Z$ is an unbiased estimator of $\theta$ then $\varphi(Y) := \operatorname{E}[Z \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $Z$.
Now we show that this function is unique. Suppose $W$ is another candidate MVUE estimator of $\theta$. Then again $\psi(Y) := \operatorname{E}[W \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $W$. Then
$$\operatorname{E}[\varphi(Y) - \psi(Y)] = 0, \quad \theta \in \Omega.$$
Since $\{ f_Y(y : \theta) : \theta \in \Omega \}$ is a complete family,
$$\operatorname{E}[\varphi(Y) - \psi(Y)] = 0 \implies \varphi(y) - \psi(y) = 0, \quad \theta \in \Omega,$$
and therefore the function $\varphi(Y)$ is the unique function of $Y$ with variance not greater than that of any other unbiased estimator. We conclude that $\varphi(Y)$ is the MVUE.
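The mechanics of the theorem can be illustrated numerically. The following sketch (not from the source; the Poisson model, sample size, and simulation settings are illustrative choices) uses a Poisson$(\lambda)$ sample, where $Y = \sum_i X_i$ is a complete sufficient statistic and, by symmetry, $\operatorname{E}[X_1 \mid Y] = Y/n$, so the sample mean is the unique MVUE of $\lambda$:

```python
import numpy as np

# Sketch: Rao-Blackwellizing the crude unbiased estimator X_1 by
# conditioning on the complete sufficient statistic Y = sum(X_i)
# yields the sample mean, which Lehmann-Scheffe says is the MVUE.
rng = np.random.default_rng(1)
lam, n, reps = 3.0, 20, 100_000          # illustrative parameter choices

x = rng.poisson(lam, size=(reps, n))
crude = x[:, 0].astype(float)            # unbiased but crude: Var = lambda
phi_y = x.mean(axis=1)                   # E[X_1 | Y] = Y/n: Var = lambda/n

print(f"crude: mean={crude.mean():.3f}  var={crude.var():.3f}")
print(f"MVUE : mean={phi_y.mean():.3f}  var={phi_y.var():.3f}")
```

Both estimators are unbiased, but the conditioned estimator's variance is smaller by a factor of roughly $n$, in line with the Rao–Blackwell step in the proof above.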
Example for when using a non-complete minimal sufficient statistic
An example of an improvable Rao–Blackwell improvement, when using a minimal sufficient statistic that is not complete, was provided by Galili and Meilijson in 2016.[4] Let $X_1, \ldots, X_n$ be a random sample from a scale-uniform distribution $X \sim U\big((1-k)\theta, (1+k)\theta\big)$ with unknown mean $\operatorname{E}[X] = \theta$ and known design parameter $k \in (0,1)$. In the search for "best" possible unbiased estimators for $\theta$, it is natural to consider $X_1$ as an initial (crude) unbiased estimator for $\theta$ and then try to improve it. Since $X_1$ is not a function of $T = \big(X_{(1)}, X_{(n)}\big)$, the minimal sufficient statistic for $\theta$ (where $X_{(1)} = \min_i X_i$ and $X_{(n)} = \max_i X_i$), it may be improved using the Rao–Blackwell theorem as follows:
$$\hat{\theta}_{RB} = \operatorname{E}_\theta\big[X_1 \mid X_{(1)}, X_{(n)}\big] = \frac{X_{(1)} + X_{(n)}}{2}.$$
However, the following unbiased estimator can be shown to have lower variance:
$$\hat{\theta}_{LV} = \frac{1}{k^2 \frac{n-1}{n+1} + 1} \cdot \frac{(1-k)X_{(1)} + (1+k)X_{(n)}}{2}.$$
And in fact, it could be even further improved when using the following estimator:
$$\hat{\theta}_\text{BAYES} = \frac{n+1}{n} \left[ 1 - \frac{\frac{X_{(1)}(1-k)}{X_{(n)}(1+k)} - 1}{\left(\frac{X_{(1)}(1-k)}{X_{(n)}(1+k)}\right)^{n+1} - 1} \right] \frac{X_{(n)}}{1+k}.$$
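A Monte Carlo sketch of the three estimators above (not from the source; the values of $\theta$, $k$, $n$, and the replication count are illustrative choices) makes the variance ordering visible:

```python
import numpy as np

# Sketch: compare the Rao-Blackwellized midrange, the lower-variance
# unbiased estimator, and the further-improved estimator for the
# scale-uniform model U((1-k)theta, (1+k)theta).
rng = np.random.default_rng(0)
theta, k, n, reps = 1.0, 0.5, 10, 200_000   # illustrative parameter choices

x = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(reps, n))
x_min, x_max = x.min(axis=1), x.max(axis=1)

# Rao-Blackwell improvement of X_1: midrange of the sample
theta_rb = (x_min + x_max) / 2

# Lower-variance unbiased estimator
theta_lv = ((1 - k) * x_min + (1 + k) * x_max) / 2 \
           / (k**2 * (n - 1) / (n + 1) + 1)

# Further-improved estimator (note: r < 1 always, so r**(n+1) - 1 != 0)
r = (x_min * (1 - k)) / (x_max * (1 + k))
theta_bayes = (n + 1) / n * (1 - (r - 1) / (r**(n + 1) - 1)) * x_max / (1 + k)

for name, est in [("RB", theta_rb), ("LV", theta_lv), ("BAYES", theta_bayes)]:
    print(f"{name:5s} mean={est.mean():.4f}  MSE={((est - theta)**2).mean():.6f}")
```

With these settings, $\hat{\theta}_{RB}$ and $\hat{\theta}_{LV}$ both average close to $\theta$, and the mean squared error of $\hat{\theta}_{LV}$ comes out below that of $\hat{\theta}_{RB}$, matching the claim that the Rao–Blackwell improvement is itself improvable here.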