One of the assumptions in a model is the independence between random variables in the joint prior distribution. Consider the following model:

p(a,b|X) ∝ p(X|a,b) p(a,b)

Now suppose we make an independence assumption for the prior: p(a,b) = p(a)p(b).

Does this assumption imply that the posterior factorizes as well, i.e., that a and b are conditionally independent given X?

p(a|X) p(b|X) ∝ p(X|a,b) p(a) p(b)

**Answer**

Your question can also be stated as: “X depends on a and b, and a and b are independent. Does this imply that a and b are conditionally independent given X?”

The answer is no. A single counter-example suffices to show it isn’t the case. Suppose X = a + b.

Then, once we know X’s value, a and b are dependent (information about one tells us exactly what the other must be). For example, suppose X = 5. Then, if a = 3, we know that b = 2. Similarly, if b = 4, then a = 1.
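The counter-example can be checked numerically. The sketch below (a minimal simulation; the choice of fair six-sided dice for a and b is an illustrative assumption, not from the original) draws a and b independently, then conditions on X = a + b taking a fixed value. Marginally the two variables are nearly uncorrelated; conditionally they are perfectly negatively correlated, since b = X − a.

```python
import random

random.seed(0)

# Draw a and b independently (fair six-sided dice, an illustrative choice).
samples = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(100_000)]

def corr(pairs):
    """Pearson correlation of a list of (u, v) pairs."""
    n = len(pairs)
    mu = sum(u for u, _ in pairs) / n
    mv = sum(v for _, v in pairs) / n
    cov = sum((u - mu) * (v - mv) for u, v in pairs) / n
    su = (sum((u - mu) ** 2 for u, _ in pairs) / n) ** 0.5
    sv = (sum((v - mv) ** 2 for _, v in pairs) / n) ** 0.5
    return cov / (su * sv)

# Marginally, a and b are (nearly) uncorrelated.
print(round(corr(samples), 3))

# Condition on X = a + b = 7: now b = 7 - a exactly, a perfect
# negative relationship, so the correlation is -1.
conditioned = [(a, b) for a, b in samples if a + b == 7]
print(round(corr(conditioned), 3))
```

Correlation is only a crude check, but it makes the point: conditioning on the sum couples the two previously independent variables.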

**Attribution**

*Source: Link, Question Author: curious_dan, Answer Author: user2522806*