If you have two independent random variables that are each normally distributed (not necessarily jointly so), then their sum is also normally distributed, which means, for example, that its excess kurtosis is 0. On the other hand, a mixture of one-dimensional normal distributions can display non-trivial higher-order moments such as skewness and excess kurtosis (fat tails), as well as multi-modality, even when the component distributions themselves show none of these features (see also this video for an easy enough example). This means that the result doesn't have to be normally distributed.
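To make the mixture claim concrete, here is a small simulation sketch (parameters are chosen purely for illustration): an equal-weight mixture of N(0,1) and N(0,9) has excess kurtosis 123/25 − 3 = 1.92, even though each component has excess kurtosis 0.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)
n = 200_000

# Equal-weight mixture of N(0,1) and N(0,9): pick a component per draw,
# then sample from it. Each component alone has excess kurtosis 0.
component = rng.random(n) < 0.5
z = np.where(component, rng.normal(0.0, 1.0, n), rng.normal(0.0, 3.0, n))

# scipy's kurtosis() reports *excess* kurtosis (Fisher definition) by default;
# for this mixture it comes out clearly positive, i.e. fat tails.
print(kurtosis(z))
```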

**My question**

How can you reconcile these two results and what is their connection?

**Answer**

It’s important to make the distinction between a sum of normal random variables and a mixture of normal random variables.

As an example, consider *independent* random variables X_1\sim N(\mu_1,\sigma_1^2), X_2\sim N(\mu_2,\sigma_2^2), \alpha_1\in\left[0,1\right], and \alpha_2=1-\alpha_1.

Let Y=X_1+X_2. Y is the sum of two independent normal random variables. What’s the probability that Y is less than or equal to zero, P(Y\leq0)? It’s simply the probability that a N(\mu_1+\mu_2,\sigma_1^2+\sigma_2^2) random variable is less than or equal to zero because the sum of two independent normal random variables is another normal random variable whose mean is the sum of the means and whose variance is the sum of the variances.
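A quick numerical sketch of this (the parameter values are hypothetical, just for illustration): compute P(Y ≤ 0) from the closed-form N(μ₁+μ₂, σ₁²+σ₂²) distribution and check it against a Monte Carlo simulation of the sum.

```python
from math import sqrt
import numpy as np
from scipy.stats import norm

# Hypothetical parameters for illustration
mu1, sigma1 = 0.0, 1.0
mu2, sigma2 = 1.0, 2.0

# Sum of independent normals: Y ~ N(mu1 + mu2, sigma1^2 + sigma2^2)
p_sum = norm.cdf(0, loc=mu1 + mu2, scale=sqrt(sigma1**2 + sigma2**2))

# Monte Carlo check: simulate the sum directly
rng = np.random.default_rng(0)
n = 100_000
samples = rng.normal(mu1, sigma1, n) + rng.normal(mu2, sigma2, n)
p_mc = (samples <= 0).mean()

print(p_sum, p_mc)  # the two estimates should agree to about two decimal places
```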

Let Z be a mixture of X_1 and X_2 with respective weights \alpha_1 and \alpha_2. Notice that Z\neq \alpha_1X_1+\alpha_2X_2. The fact that Z is defined as a mixture with those specific weights means that the CDF of Z is F_Z(z)=\alpha_1F_1(z)+\alpha_2F_2(z), where F_1 and F_2 are the CDFs of X_1 and X_2, respectively. So what is the probability that Z is less than or equal to zero, P(Z\leq0)? It’s F_Z(0)=\alpha_1F_1(0)+\alpha_2F_2(0).
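The contrast with the sum can be sketched the same way (again with hypothetical parameters): evaluate F_Z(0) = α₁F₁(0) + α₂F₂(0) directly, and confirm it by simulating the mixture, i.e. first picking a component with probabilities α₁, α₂ and then drawing from that component.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical parameters for illustration
mu1, sigma1 = 0.0, 1.0
mu2, sigma2 = 1.0, 2.0
alpha1 = 0.3
alpha2 = 1 - alpha1

# Mixture CDF evaluated at 0: F_Z(0) = alpha1 * F1(0) + alpha2 * F2(0)
p_mix = alpha1 * norm.cdf(0, mu1, sigma1) + alpha2 * norm.cdf(0, mu2, sigma2)

# Monte Carlo check: to sample from the mixture, first pick a component
# (component 1 with probability alpha1), then draw from that component.
rng = np.random.default_rng(0)
n = 100_000
component = rng.random(n) < alpha1
samples = np.where(component,
                   rng.normal(mu1, sigma1, n),
                   rng.normal(mu2, sigma2, n))
p_mc = (samples <= 0).mean()

print(p_mix, p_mc)  # the two estimates should agree to about two decimal places
```

Note the sampling step: a mixture draw is *either* an X₁ draw *or* an X₂ draw, never a weighted sum of both, which is exactly why Z ≠ α₁X₁ + α₂X₂.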

**Attribution**
*Source : Link , Question Author : vonjd , Answer Author : Dilip Sarwate*