Most of the time, sticking to a pure Bayesian approach with proper priors leads to admissible estimators.

Nevertheless, there are good reasons to use Empirical Bayes in many cases, and frequentists enjoy better accuracy with Ridge Regression, Lasso, etc., tuned by cross-validation.

My question is: what conditions must be satisfied to make sure an Empirical Bayes approach leads to admissible estimators, explained if possible without rigorous mathematics?

For a practitioner, assuming one has the perfect structure of the likelihood (given to us by an oracle), are there eligibility criteria for which parameters of the prior distributions can be estimated from the data without risking inadmissibility? In other words, is there a step-by-step methodology/guideline for building hierarchical models where some (hyper)parameters are estimated from the data while avoiding inadmissible estimation?
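To make the kind of procedure I mean concrete, here is a minimal sketch (a hypothetical illustration, assuming a normal-means model with known sampling variance) of the empirical-Bayes step in question: a prior variance is estimated from the marginal distribution of the data and then plugged into the Bayes rule.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0                                  # known sampling variance (the "oracle" likelihood)
theta = rng.normal(0.0, 2.0, size=50)         # true means, drawn from a N(0, tau^2) prior, tau^2 = 4
y = rng.normal(theta, np.sqrt(sigma2))        # observations y_i ~ N(theta_i, sigma^2)

# Empirical Bayes step: marginally y_i ~ N(0, tau^2 + sigma^2), so estimate the
# hyperparameter tau^2 by method of moments from the data -- this is exactly the
# step whose effect on admissibility the question asks about.
tau2_hat = max(float(np.mean(y**2)) - sigma2, 0.0)

# Plug the estimate into the Bayes rule for the N(0, tau^2) prior:
shrinkage = tau2_hat / (tau2_hat + sigma2)
theta_hat = shrinkage * y

# Compare total squared error against the MLE (y itself).
mse_mle = float(np.mean((y - theta) ** 2))
mse_eb = float(np.mean((theta_hat - theta) ** 2))
```

The plug-in shrinkage factor mimics a James–Stein-type estimator; whether such a data-dependent prior still yields an admissible rule is precisely the question.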

Recently, I have found a starting point:

“Choice of hierarchical priors: admissibility in estimation of normal means”

**Answer**

*Attribution — Source: Link, Question Author: Cagdas Ozgenc, Answer Author: Community*