# When does $X_n\stackrel{d}{\rightarrow}X$ and $Y_n\stackrel{d}{\rightarrow}Y$ imply $X_n+Y_n\stackrel{d}{\rightarrow}X+Y$?

The question:

$$X_n\stackrel{d}{\rightarrow}X \ \text{ and } \ Y_n\stackrel{d}{\rightarrow}Y \stackrel{?}{\implies} X_n+Y_n\stackrel{d}{\rightarrow}X+Y$$

I know that this does not hold in general; Slutsky’s theorem, for instance, only applies when one of the sequences converges in distribution (equivalently, in probability) to a constant.
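A quick simulation of the standard counterexample makes the failure concrete (the distributions here are my own choice for illustration): take $$X_n = Z$$ and $$Y_n = -Z$$ for a single $$Z \sim N(0,1)$$.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)

# X_n = Z and Y_n = -Z for every n: both sequences converge in distribution
# to N(0, 1), but X_n + Y_n is identically 0 -- not N(0, 2), which is the
# distribution of X + Y for independent N(0, 1) limits.
x_n, y_n = z, -z
print(np.var(x_n + y_n))           # 0.0 exactly

w = rng.standard_normal(100_000)   # an independent copy of Z
print(np.var(z + w))               # close to 2
```

Both marginal limits are $$N(0,1)$$, yet the sum is degenerate at zero, so the naive implication fails.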

However, are there instances in which it does hold?

For instance, does it hold if the sequences $$X_n$$ and $$Y_n$$ are independent?

Formalizing @Ben’s answer: independence is almost a sufficient condition, because the characteristic function of the sum of two independent random variables is the product of their marginal characteristic functions. Let $$Z_n = X_n + Y_n$$. Under independence of $$X_n$$ and $$Y_n$$,

$$\phi_{Z_n}(t) = \phi_{X_n}(t)\phi_{Y_n}(t)$$
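This factorization is easy to check empirically. Below is a minimal sketch using the empirical characteristic function $$\hat\phi(t) = \frac{1}{N}\sum_j e^{itS_j}$$; the particular distributions ($$N(0,1)$$ and an exponential) and the evaluation point are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
t = 0.7  # arbitrary evaluation point

# Independent draws standing in for X_n and Y_n (illustrative choices):
x = rng.standard_normal(N)        # X_n ~ N(0, 1)
y = rng.exponential(1.0, size=N)  # Y_n ~ Exp(1), independent of X_n

def ecf(sample, t):
    """Empirical characteristic function: mean of exp(i t S) over the sample."""
    return np.mean(np.exp(1j * t * sample))

lhs = ecf(x + y, t)            # estimate of phi_{Z_n}(t)
rhs = ecf(x, t) * ecf(y, t)    # estimate of phi_{X_n}(t) * phi_{Y_n}(t)
print(abs(lhs - rhs))          # small, up to Monte Carlo error
```

The two estimates agree up to sampling noise of order $$1/\sqrt{N}$$, as the factorization predicts.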

So

$$\lim \phi_{Z_n}(t) =\lim \Big [\phi_{X_n}(t)\phi_{Y_n}(t)\Big]$$

and, since $$X_n$$ and $$Y_n$$ each converge in distribution, their characteristic functions converge pointwise to $$\phi_X$$ and $$\phi_Y$$, so

$$\lim \Big [\phi_{X_n}(t)\phi_{Y_n}(t)\Big] = \lim \phi_{X_n}(t)\cdot \lim \phi_{Y_n}(t) = \phi_{X}(t)\cdot \phi_{Y}(t)$$

which is the characteristic function of $$X+Y$$ if $$X$$ and $$Y$$ are independent; by Lévy’s continuity theorem it then follows that $$Z_n \stackrel{d}{\rightarrow} X+Y$$. And $$X$$ and $$Y$$ will be independent if one of the two has a continuous distribution function (see this post). This is the condition required, in addition to independence of the sequences, so that independence is preserved in the limit.
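The positive result can also be checked by simulation. The sketch below (assuming scipy is available; the sequences $$X_n \sim N(0, 1+1/n)$$ and $$Y_n \sim \text{Exp}(\text{mean } 1+1/n)$$ are hypothetical choices for illustration) compares samples of $$X_n + Y_n$$ against samples of $$X + Y$$ with a two-sample Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
N, n = 50_000, 1_000

# Independent sequences (illustrative):
# X_n ~ N(0, 1 + 1/n)  ->  X ~ N(0, 1)
# Y_n ~ Exp(mean 1 + 1/n)  ->  Y ~ Exp(mean 1)
x_n = rng.normal(0.0, np.sqrt(1 + 1 / n), size=N)
y_n = rng.exponential(1 + 1 / n, size=N)

x = rng.standard_normal(N)        # limit X
y = rng.exponential(1.0, size=N)  # limit Y, independent of X

# A small KS statistic means the samples of X_n + Y_n and X + Y are
# statistically indistinguishable, as the argument above predicts.
res = stats.ks_2samp(x_n + y_n, x + y)
print(res.statistic)
```

For large $$n$$ the KS statistic is of the order of pure sampling noise, consistent with $$X_n + Y_n \stackrel{d}{\rightarrow} X + Y$$.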

Without independence the factorization need not hold; in general

$$\phi_{Z_n}(t) \neq \phi_{X_n}(t)\phi_{Y_n}(t)$$

and no general assertion can be made about the limit.
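The failure of the factorization under dependence is visible numerically as well. In the fully dependent case $$X_n = Y_n = Z$$ with $$Z \sim N(0,1)$$ (an illustrative choice), we have $$\phi_{2Z}(1) = e^{-2}$$ while $$\phi_Z(1)^2 = e^{-1}$$:

```python
import numpy as np

rng = np.random.default_rng(3)
z = rng.standard_normal(200_000)
t = 1.0

def ecf(sample):
    """Empirical characteristic function evaluated at t."""
    return np.mean(np.exp(1j * t * sample))

# Fully dependent case X_n = Y_n = Z:
# |phi_{Z+Z}(1)| = e^{-2} ~ 0.135, while |phi_Z(1)|^2 = e^{-1} ~ 0.368.
print(abs(ecf(z + z)))    # ~ 0.135
print(abs(ecf(z)) ** 2)   # ~ 0.368
```

The two quantities disagree by construction, which is exactly why no general statement about the limit of $$\phi_{Z_n}$$ is possible without some control over the joint dependence.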