# What are the regularity conditions for Likelihood Ratio test

Could anyone please tell me what the regularity conditions are for the asymptotic distribution of Likelihood Ratio test?

Everywhere I look, it is written ‘Under the regularity conditions’ or ‘under the probabilistic regularities’. What are the conditions exactly? That the first and second log-likelihood derivatives exist and that the information matrix is nonzero? Or something else entirely?

The required regularity conditions are listed in most intermediate textbooks and are no different from those of the mle. The following concern the one-parameter case, but their extension to the multiparameter case is straightforward.

Condition 1: The pdfs are distinct, i.e. $\theta \neq \theta ^{\prime} \Rightarrow f(x_i;\theta)\neq f(x_i;\theta ^{\prime})$

Note that this condition essentially states that the parameter identifies the pdf.

Condition 2: The pdfs have common support for all $\theta$

What this implies is that the support does not depend on $\theta$

Condition 3: The point $\theta_0$, the true parameter that is, is an interior point of some set $\Omega$

The last one rules out the possibility that $\theta_0$ lies at an endpoint of the parameter interval.

These three together guarantee that the likelihood is maximised at the true parameter $\theta_0$ and then that the mle $\hat{\theta}$ that solves the equation

$$\frac{\partial \ell(\theta)}{\partial \theta} = 0$$

(where $\ell(\theta)$ denotes the log-likelihood) is consistent.
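As an illustration of this consistency (a sketch I am adding, using an exponential model as the assumed example, not something from a specific textbook): the score equation can be solved in closed form, and the root approaches the true parameter as the sample size grows.

```python
import numpy as np

# For an exponential sample x_1,...,x_n with rate theta, the score equation is
#   d/dtheta log L(theta) = n/theta - sum(x) = 0  =>  theta_hat = n / sum(x).
# Consistency: theta_hat should approach the true theta_0 as n grows.
rng = np.random.default_rng(0)
theta0 = 2.0
for n in (100, 10_000, 1_000_000):
    x = rng.exponential(scale=1 / theta0, size=n)
    theta_hat = n / x.sum()  # unique root of the score equation
    print(n, theta_hat)
```

Running this, the estimate drifts toward $\theta_0 = 2$ as $n$ increases, which is exactly what consistency asserts.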

Condition 4: The pdf $f(x;\theta)$ is twice differentiable as a function of $\theta$

Condition 5: The integral $\int_{-\infty}^{\infty} f(x;\theta)\ \mathrm dx$ can be differentiated twice under the integral sign as a function of $\theta$

We need the last two to derive the Fisher Information which plays a central role in the theory of convergence of the mle.
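To see what conditions 4 and 5 buy us (again an exponential sketch of my own, not from the original answer): they justify the information identity $I(\theta) = E_\theta\big[(\partial_\theta \log f)^2\big] = -E_\theta\big[\partial^2_\theta \log f\big]$, which can be checked by Monte Carlo.

```python
import numpy as np

# Assumed example: f(x; theta) = theta * exp(-theta * x), for which
#   d/dtheta  log f = 1/theta - x
#   d2/dtheta2 log f = -1/theta**2   =>  I(theta) = 1/theta**2
rng = np.random.default_rng(1)
theta = 2.0
x = rng.exponential(scale=1 / theta, size=1_000_000)

info_from_score = np.mean((1 / theta - x) ** 2)  # E[(d/dtheta log f)^2]
info_exact = 1 / theta**2                        # -E[d2/dtheta2 log f]
print(info_from_score, info_exact)
```

The two quantities agree (here $1/\theta^2 = 0.25$), which is the equality that differentiation under the integral sign makes legitimate.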

For some authors these suffice, but to be thorough we additionally need a final condition that ensures the asymptotic normality of the mle.

Condition 6: The pdf $f(x;\theta)$ is three times differentiable as a function of $\theta$. Further, for all $\theta \in \Omega$, there exist a constant $c$ and a function $M(x)$ such that

$$\left| \frac{\partial^3}{\partial \theta^3} \log f(x;\theta) \right| \leq M(x)$$

with $E_{\theta_0} \left[M(X)\right] <\infty$, for all $|\theta-\theta_0| < c$ and all $x$ in the support of $X$

Essentially the last condition allows us to conclude that the remainder of a second order Taylor expansion about $\theta_0$ is bounded in probability and thus poses no problem asymptotically.
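Under conditions 1–6, Wilks' theorem then gives the asymptotic distribution you asked about: $-2\log\Lambda \to \chi^2_1$ under $H_0$ (one restricted parameter). A quick simulation sketch, again using an exponential model as my assumed example:

```python
import numpy as np

# Exponential sample, testing H0: theta = 1. With S = sum(x) and
# theta_hat = n/S, the LRT statistic simplifies to
#   -2 log Lambda = 2 * (n * log(n / S) - n + S)
rng = np.random.default_rng(2)
n, reps = 200, 5000
stats = np.empty(reps)
for r in range(reps):
    x = rng.exponential(scale=1.0, size=n)  # data generated under H0
    S = x.sum()
    stats[r] = 2 * (n * np.log(n / S) - n + S)

# chi^2_1 has mean 1 and P(X > 3.841) ~ 0.05
print(stats.mean(), np.mean(stats > 3.841))
```

The simulated statistics have mean close to 1 and reject at the $3.841$ cutoff about 5% of the time, matching the $\chi^2_1$ limit.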

Is that what you had in mind?