If $X$ and $Y$ are independent random variables, then the variance of the product $XY$ is given by
$$V(XY)=\{E(X)\}^2V(Y)+\{E(Y)\}^2V(X)+V(X)V(Y)$$
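A quick Monte Carlo sanity check of this scalar identity (the normal distributions and their parameters below are arbitrary choices for illustration):

```python
import numpy as np

# Check V(XY) = {E(X)}^2 V(Y) + {E(Y)}^2 V(X) + V(X)V(Y)
# for independent scalar random variables X and Y.
rng = np.random.default_rng(0)
n = 2_000_000
x = rng.normal(loc=2.0, scale=3.0, size=n)   # E(X) = 2,  V(X) = 9
y = rng.normal(loc=-1.0, scale=2.0, size=n)  # E(Y) = -1, V(Y) = 4

empirical = np.var(x * y)
theoretical = 2.0**2 * 4 + (-1.0)**2 * 9 + 9 * 4  # = 16 + 9 + 36 = 61

print(empirical, theoretical)  # the two values should agree closely
```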
If $X$ and $y$ are an independent $m\times m$ matrix and $m\times 1$ vector, respectively, then what would be the variance of the product $Xy$?

My Attempt
$$V(Xy)=E(X)V(y)\{E(X)\}'+\{E(y)\otimes I_m\}'V\{\mathrm{vec}(X)\}\{E(y)\otimes I_m\}+V\{\mathrm{vec}(X)\}\{V(y)\otimes I_m\}$$
I know this is not right; at least the last term is wrong. I'd highly appreciate it if you could give me the right identity or point me to a reference. Thanks in advance for your help and time.
Answer
I'll assume that the elements of $y$ are i.i.d. and likewise for the elements of $X$. This assumption is important, though, so be forewarned!

Each element of $Xy$ is a sum of $m$ products, $x_i^Ty=\sum_j x_{ij}y_j$, and the products involve disjoint sets of independent variables, so their variances add. Each diagonal element of the covariance matrix therefore equals $mV(x_{ij}y_j)$, where $V(x_{ij}y_j)$ is given by the scalar identity at the top of your question.

The off-diagonal elements all equal zero, as the rows of $X$ are independent. To see this, assume $Ex_{ij}=Ey_i=0\ \forall i,j$ (note that the zero-mean assumption on $X$ is doing real work here: covariances of products are not shift-invariant, so this is not merely a normalization). Define $x_i$ as the $i$th row of $X$, transposed to be a column vector. Then:
$$\mathrm{Cov}(x_i^Ty,\,x_j^Ty)=E\left[(x_i^Ty)^T(x_j^Ty)\right]=E\left[y^Tx_ix_j^Ty\right]=E_yE_x\left[y^Tx_ix_j^Ty\right]$$
Note that $x_ix_j^T$ is a matrix, the $(p,q)$th element of which equals $x_{ip}x_{jq}$. When $i\neq j$, the expectation with respect to $x$ of $y^Tx_ix_j^Ty$ equals $0$ for any fixed $y$, as each term is just the expectation of the product of two independent mean-zero r.v.s, multiplied by $y_py_q$. Consequently, the entire expectation equals $0$.
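Both claims can be sanity-checked by simulation (the distributions below are illustrative; $X$ is taken mean-zero, as the argument above assumes):

```python
import numpy as np

# Monte Carlo check for i.i.d. elements with E(x_ij) = 0:
#   - each diagonal entry of Cov(Xy) should be about m * V(x_ij * y_j)
#   - the off-diagonal entries should be about 0
rng = np.random.default_rng(1)
m, n = 3, 400_000

X = rng.normal(loc=0.0, scale=1.0, size=(n, m, m))  # E(x) = 0, V(x) = 1
y = rng.normal(loc=1.0, scale=2.0, size=(n, m, 1))  # E(y) = 1, V(y) = 4

Xy = (X @ y).squeeze(-1)          # n samples of the m-vector Xy
cov = np.cov(Xy, rowvar=False)    # empirical m x m covariance matrix

# scalar identity: V(xy) = E(x)^2 V(y) + E(y)^2 V(x) + V(x)V(y) = 0 + 1 + 4
v_xy = 0.0**2 * 4 + 1.0**2 * 1 + 1 * 4
diag_err = np.max(np.abs(np.diag(cov) - m * v_xy))
offdiag_max = np.max(np.abs(cov[~np.eye(m, dtype=bool)]))
print(diag_err, offdiag_max)      # both should be small
```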
Attribution
Source : Link , Question Author : MYaseen208 , Answer Author : jbowman