I know I can’t use convolution.

I have two random variables A and B, and they are dependent.

I need the distribution function of A+B.

**Answer**

As vinux points out, one needs the joint distribution of A and B, and it is not obvious from OP Mesko’s response “I know Distributive function of A and B” that he is saying he knows the *joint* distribution of A and B: he may well be saying that he knows the marginal distributions of A and B. However, assuming that Mesko does know the joint distribution, the answer is given below.

From the convolution integral in OP Mesko’s comment (which is wrong, by the way), it could be inferred that Mesko is interested in *jointly continuous* random variables A and B with joint probability density function $f_{A,B}(a,b)$. In this case,

$$f_{A+B}(z)=\int_{-\infty}^{\infty} f_{A,B}(a,\,z-a)\,\mathrm{d}a=\int_{-\infty}^{\infty} f_{A,B}(z-b,\,b)\,\mathrm{d}b.$$
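As a numerical sanity check of this formula (my own illustration, not part of the original answer): for a jointly Gaussian pair with correlation $\rho$, the sum $A+B$ is known to be $N(0,\,\sigma_A^2+\sigma_B^2+2\rho\sigma_A\sigma_B)$, so integrating the joint density along the line $a+b=z$ can be compared against the closed form. All parameter values below are arbitrary choices for the sketch.

```python
import math

# Hypothetical example: A, B jointly Gaussian, zero means, standard
# deviations sa, sb, correlation rho (arbitrary values for illustration).
sa, sb, rho = 1.0, 2.0, 0.6

def joint_pdf(a, b):
    # Bivariate normal density f_{A,B}(a, b).
    u, v = a / sa, b / sb
    norm = 2 * math.pi * sa * sb * math.sqrt(1 - rho**2)
    return math.exp(-(u*u - 2*rho*u*v + v*v) / (2 * (1 - rho**2))) / norm

def f_sum(z, lo=-12.0, hi=12.0, n=4000):
    # Midpoint-rule approximation of f_{A+B}(z) = ∫ f_{A,B}(a, z - a) da.
    da = (hi - lo) / n
    return sum(joint_pdf(lo + (i + 0.5) * da, z - (lo + (i + 0.5) * da))
               for i in range(n)) * da

# Exact answer for this joint law: N(0, sa^2 + sb^2 + 2*rho*sa*sb).
sd = math.sqrt(sa**2 + sb**2 + 2 * rho * sa * sb)
exact = math.exp(-1.5**2 / (2 * sd**2)) / (sd * math.sqrt(2 * math.pi))
print(f_sum(1.5), exact)  # the two values agree closely
```

Note that the marginal convolution $\int f_A(a)f_B(z-a)\,da$ would give the wrong answer here, because the dependence (through $\rho$) changes the variance of the sum.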

When A and B are independent, the joint density function factors into the product of the marginal density functions, $f_{A,B}(a,z-a)=f_A(a)f_B(z-a)$, and we get the more familiar convolution formula for independent random variables. A similar result applies for discrete random variables as well.
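The discrete analogue, $p_{A+B}(z)=\sum_a p_{A,B}(a,\,z-a)$, can be sketched in a few lines. The joint pmf below is a made-up example of dependent variables, not anything from the question.

```python
from collections import defaultdict

# Hypothetical joint pmf of dependent A, B: p_joint[(a, b)] = P(A=a, B=b).
# Entries sum to 1; A and B are dependent since P(A=1, B=1) != P(A=1)P(B=1).
p_joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.05, (1, 1): 0.55,
}

# p_{A+B}(z) = sum over a of p_{A,B}(a, z - a).
p_sum = defaultdict(float)
for (a, b), p in p_joint.items():
    p_sum[a + b] += p

print(dict(p_sum))
```

The same accumulation loop works for any joint pmf, however the dependence between A and B arises.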

Things are more complicated if A and B are not jointly continuous, or if one random variable is continuous and the other is discrete. However, in all cases, one can always find the cumulative distribution function $F_{A+B}(z)$ of $A+B$ as the total probability mass in the region of the plane $\{(a,b)\colon a+b\leq z\}$, and compute the probability density function, or the probability mass function, or whatever, from the distribution function. Indeed, the above formula is obtained by writing $F_{A+B}(z)$ as a double integral of the joint density function over the specified region and then “differentiating under the integral sign.”
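This mass-in-the-half-plane view can be sketched by simulation (again with a made-up correlated-Gaussian pair, for which the exact CDF of $A+B$ is known): estimate $F_{A+B}(z)$ as the fraction of joint samples falling in $\{(a,b)\colon a+b\leq z\}$.

```python
import math
import random

random.seed(42)
sa, sb, rho = 1.0, 2.0, 0.6  # arbitrary illustrative parameters
sd_sum = math.sqrt(sa**2 + sb**2 + 2 * rho * sa * sb)

def sample_ab():
    # Build a correlated Gaussian pair from two independent standard normals.
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    return sa * x, sb * (rho * x + math.sqrt(1 - rho**2) * y)

z, n = 1.5, 200_000
hits = sum(1 for _ in range(n) if sum(sample_ab()) <= z)
est = hits / n  # probability mass in {(a, b): a + b <= z}

# Exact CDF of N(0, sd_sum^2) at z, for comparison.
exact = 0.5 * (1 + math.erf(z / (sd_sum * math.sqrt(2))))
print(est, exact)  # Monte Carlo estimate vs. exact value
```

This approach needs no density at all, so it applies equally to discrete, mixed, or otherwise awkward joint distributions, provided one can sample from (or integrate) the joint law.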

**Attribution**
*Source : Link , Question Author : Mesko , Answer Author : Dilip Sarwate*