What is the intuition behind the independence of X2 − X1 and X1 + X2, Xi ∼ N(0,1)?

I was hoping someone could propose an argument explaining why the random variables Y1 = X2 − X1 and Y2 = X1 + X2, with the Xi having the standard normal distribution, are statistically independent. The proof of that fact follows easily from the MGF technique, yet I find the result extremely counter-intuitive.

I would appreciate therefore the intuition here, if any.

Thank you in advance.

EDIT: The subscripts do not indicate order statistics but IID observations from the standard normal distribution.


This is standard normally distributed data:
scatter plot in first coordinate system
Notice that the distribution is circularly symmetric.

When you switch to Y1 = X2 − X1 and Y2 = X1 + X2, you effectively rotate and scale the axes, like this:
scatter plot with rotated coordinate system
This new coordinate system has the same origin as the original one, and its axes are orthogonal. Because of the circular symmetry, the variables are still independent in the new coordinate system.
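A quick simulation (my own sketch, not part of the original answer) makes the picture concrete: the linear map (X1, X2) ↦ (X2 − X1, X1 + X2) has orthogonal rows of length √2, so it is an orthogonal transformation followed by a √2 scaling. Consistently with that, Cov(Y1, Y2) = Var(X2) − Var(X1) = 0, and each Yi has variance 2:

```python
# Sketch: draw IID standard normals, form Y1 = X2 - X1 and Y2 = X1 + X2,
# and check that the sample correlation is near zero and each variance
# is near 2 (the sqrt(2) scaling of the rotated axes).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

y1 = x2 - x1
y2 = x1 + x2

corr = np.corrcoef(y1, y2)[0, 1]
print(f"sample correlation: {corr:.4f}")  # close to 0
print(f"var(Y1) = {y1.var():.3f}, var(Y2) = {y2.var():.3f}")  # each close to 2
```

Zero correlation alone would not prove independence in general, but since (Y1, Y2) is jointly normal (a linear transform of a normal vector), zero covariance does imply independence here.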

Source: Link, Question Author: JohnK, Answer Author: dobiwan
