Although it may sound like an excessive notion of optimality, one can hope to obtain an estimator δ of a unidimensional parameter θ that is always closer to θ than any other estimator, in distribution if not almost surely, meaning that the cdf of (δ-θ) is steeper than that of competing estimators sharing the same cdf value at zero (for instance ½, to make them all median-unbiased). When I saw this question on X validated, I thought of the Cauchy location example, where there is no uniformly optimal estimator, albeit a large collection of unbiased ones. But a simulation experiment shows that the MLE does better than the competition, at least better than the four estimators pictured above (since I also tried the Pitman estimator via Christian Hennig's smoothmest R package). The differences to the MLE empirical cdf make it clearer below (with tomato for a score correction, gold for the Pitman estimator, sienna for the 38% trimmed mean, and blue for the median):

I wonder at a general theory along these lines. There is a vague similarity with Pitman nearness or closeness, but without the paradoxes induced by that criterion. It is more in the spirit of stochastic dominance, which may be achievable for location invariant and mean unbiased estimators…
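For readers wanting to replay the experiment, here is a minimal R sketch of the comparison, not the code behind the figures above: the sample size, number of replications, and integration and optimisation ranges are arbitrary choices, the score-corrected estimator is omitted, and the Pitman estimator is computed directly by numerical integration (as the posterior mean of the location under a flat prior) rather than through smoothmest.

```r
## Minimal sketch: empirical cdfs of |delta - theta| for several location
## estimators of a Cauchy sample; theta = 0 w.l.o.g. by location invariance
set.seed(101)
N <- 1e3                     # Monte Carlo replications (arbitrary)
n <- 11                      # Cauchy sample size (arbitrary)
mle <- med <- trm <- pit <- numeric(N)
for (i in 1:N) {
  x <- rcauchy(n)
  # MLE of the location, maximising the Cauchy log-likelihood near the median
  # (the likelihood may be multimodal, hence the crude bracketing interval)
  mle[i] <- optimize(function(m) -sum(dcauchy(x, m, log = TRUE)),
                     interval = median(x) + c(-10, 10))$minimum
  med[i] <- median(x)
  trm[i] <- mean(x, trim = 0.38)          # 38% trimmed mean
  # Pitman (minimum risk equivariant) estimator: posterior mean of the
  # location under a flat prior, by numerical integration over a crude range
  lik <- function(m) sapply(m, function(mm) prod(dcauchy(x, mm)))
  bnd <- median(x) + c(-20, 20)
  pit[i] <- integrate(function(m) m * lik(m), bnd[1], bnd[2])$value /
            integrate(lik, bnd[1], bnd[2])$value
}
# a steeper cdf of the absolute error means stochastic dominance in |delta - theta|
plot(ecdf(abs(mle)), xlim = c(0, 2), main = "cdf of |estimate - theta|")
lines(ecdf(abs(pit)), col = "gold")
lines(ecdf(abs(trm)), col = "sienna")
lines(ecdf(abs(med)), col = "blue")
```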