I’ve recently encountered the bivariate Poisson distribution, but I’m a little confused as to how it can be derived.

The distribution is given by:

P(X = x, Y = y) = e^{-(\theta_{1}+\theta_{2}+\theta_{0})} \displaystyle\frac{\theta_{1}^{x}}{x!}\frac{\theta_{2}^{y}}{y!} \sum_{i=0}^{\min(x,y)}\binom{x}{i}\binom{y}{i}i!\left(\frac{\theta_{0}}{\theta_{1}\theta_{2}}\right)^{i}

From what I can gather, the \theta_{0} term is a measure of correlation between X and Y; hence, when X and Y are independent, \theta_{0} = 0 and the distribution simply becomes the product of two univariate Poisson distributions.

Bearing this in mind, my confusion centers on the summation term; I assume this term accounts for the correlation between X and Y.

It seems to me that the summand constitutes some sort of product of binomial cumulative distribution functions where the probability of “success” is given by \left(\frac{\theta_{0}}{\theta_{1}\theta_{2}}\right) and the probability of “failure” is given by i!^{\frac{1}{\min(x,y)-i}}, because \left(i!^{\frac{1}{\min(x,y)-i}}\right)^{(\min(x,y)-i)} = i!, but I could be way off with this.

Could somebody provide some assistance on how this distribution can be derived? Also, if any answer could also show how this model might be extended to a multivariate scenario (say, three or more random variables), that would be great!

(Finally, I have noted that there was a similar question posted before (Understanding the bivariate Poisson distribution), but the derivation wasn’t actually explored.)

**Answer**

In a slide presentation, Karlis and Ntzoufras define the bivariate Poisson as the distribution of (X,Y)=(X_1+X_0,\,X_2+X_0) where the X_i independently have Poisson(\theta_i) distributions. Recall that having such a distribution means

\Pr(X_i=k) = e^{-\theta_i}\frac{\theta_i^k}{k!}

for k=0, 1, 2, \ldots.

The event (X,Y)=(x,y) is the disjoint union of the events

(X_0,X_1,X_2) = (i,x-i,y-i)

for all i that make all three components non-negative integers, from which we may deduce that 0 \le i \le \min(x,y). Because the X_i are independent their probabilities multiply, whence

F_{(\theta_0,\theta_1,\theta_2)}(x,y)=\Pr((X,Y)=(x,y)) \\= \sum_{i=0}^{\min(x,y)} \Pr(X_0=i)\Pr(X_1=x-i)\Pr(X_2=y-i).
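As a numeric sanity check, this convolution sum can be evaluated directly with a few lines of Python (a sketch using only the standard library; the function names are mine):

```python
from math import exp, factorial

def pois_pmf(k, theta):
    """Univariate Poisson pmf: e^{-theta} theta^k / k!."""
    return exp(-theta) * theta**k / factorial(k)

def bivariate_pois_pmf(x, y, t0, t1, t2):
    """Pr((X,Y)=(x,y)) as the convolution sum over the latent count X_0 = i."""
    return sum(pois_pmf(i, t0) * pois_pmf(x - i, t1) * pois_pmf(y - i, t2)
               for i in range(min(x, y) + 1))

# The probabilities should sum to (essentially) 1 over a large enough grid:
total = sum(bivariate_pois_pmf(x, y, 0.5, 1.0, 2.0)
            for x in range(40) for y in range(40))
print(round(total, 10))  # → 1.0

# With theta_0 = 0 the distribution factorizes into two independent Poissons:
p = bivariate_pois_pmf(3, 5, 0.0, 1.0, 2.0)
print(p == pois_pmf(3, 1.0) * pois_pmf(5, 2.0))  # → True
```

The second check confirms the observation in the question: when \theta_0 = 0, only the i=0 term survives and the joint pmf is the product of the marginals.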

**This is a formula; we are done.** But to see that it is equivalent to the formula in the question, use the definition of the Poisson distribution to write these probabilities in terms of the parameters \theta_i and (assuming neither of \theta_1,\theta_2 is zero) re-work it algebraically to look as much as possible like the product \Pr(X_1=x)\Pr(X_2=y):

\eqalign{
F_{(\theta_0,\theta_1,\theta_2)}(x,y) &= \sum_{i=0}^{\min(x,y)} \left( e^{-\theta_0} \frac{\theta_0^i}{i!}\right) \left( e^{-\theta_1} \frac{\theta_1^{x-i}}{(x-i)!}\right) \left( e^{-\theta_2} \frac{\theta_2^{y-i}}{(y-i)!}\right) \\
&= e^{-(\theta_1+\theta_2)}\frac{\theta_1^x}{x!}\frac{\theta_2^y}{y!}\left(e^{-\theta_0}\sum_{i=0}^{\min(x,y)} \frac{\theta_0^i}{i!}\frac{x!\,\theta_1^{-i}}{(x-i)!}\frac{y!\,\theta_2^{-i}}{(y-i)!}\right).
}

If you really want to (it is somewhat suggestive), you can re-express the terms in the sum using the binomial coefficients \binom{x}{i}=x!/((x-i)!\,i!) and \binom{y}{i}, yielding

F_{(\theta_0,\theta_1,\theta_2)}(x,y) = e^{-(\theta_0+\theta_1+\theta_2)}\frac{\theta_1^x}{x!}\frac{\theta_2^y}{y!}\sum_{i=0}^{\min(x,y)}i!\binom{x}{i}\binom{y}{i}\left(\frac{\theta_0}{\theta_1\theta_2}\right)^i,

exactly as in the question.
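The algebra can also be confirmed numerically: the convolution sum and the binomial-coefficient closed form agree for every (x, y) (a sketch; the closed form assumes \theta_1, \theta_2 > 0):

```python
from math import comb, exp, factorial

def pmf_convolution(x, y, t0, t1, t2):
    """Sum over the latent common count X_0 = i."""
    def pois(k, th):
        return exp(-th) * th**k / factorial(k)
    return sum(pois(i, t0) * pois(x - i, t1) * pois(y - i, t2)
               for i in range(min(x, y) + 1))

def pmf_closed_form(x, y, t0, t1, t2):
    """The formula from the question (requires t1, t2 > 0)."""
    s = sum(factorial(i) * comb(x, i) * comb(y, i) * (t0 / (t1 * t2))**i
            for i in range(min(x, y) + 1))
    return exp(-(t0 + t1 + t2)) * t1**x / factorial(x) * t2**y / factorial(y) * s

# The two expressions agree up to floating-point rounding:
for x in range(6):
    for y in range(6):
        a = pmf_convolution(x, y, 0.7, 1.3, 2.1)
        b = pmf_closed_form(x, y, 0.7, 1.3, 2.1)
        assert abs(a - b) < 1e-12
print("match")  # → match
```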

Generalization to multivariate scenarios could proceed in several ways, depending on the flexibility needed. The simplest would contemplate the distribution of

(X_1+X_0, X_2+X_0, \ldots, X_d+X_0)

for independent Poisson distributed variates X_0, X_1, \ldots,X_d. For more flexibility additional variables could be introduced. For instance, use independent Poisson \eta_i variables Y_1, \ldots, Y_d and consider the multivariate distribution of the X_i + (Y_i + Y_{i+1} + \cdots + Y_d), i=1, 2, \ldots, d.
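To see the simplest generalization in action, one can simulate (X_1+X_0, \ldots, X_d+X_0) and check that the shared component X_0 induces covariance \theta_0 between every pair of coordinates (a sketch using only the standard library; `poisson_sample` is my own Knuth-style sampler, adequate for small \theta):

```python
import random
from math import exp

def poisson_sample(theta, rng):
    """Knuth's method: count uniform draws until their product falls below e^{-theta}."""
    L, k, p = exp(-theta), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def multivariate_poisson(thetas, theta0, rng):
    """One draw of (X_1+X_0, ..., X_d+X_0) from independent Poisson components."""
    x0 = poisson_sample(theta0, rng)
    return [poisson_sample(t, rng) + x0 for t in thetas]

rng = random.Random(0)
n = 20_000
draws = [multivariate_poisson([1.0, 2.0, 3.0], 0.5, rng) for _ in range(n)]

# Marginal means are theta_i + theta_0; pairwise covariance is theta_0 = 0.5:
m1 = sum(d[0] for d in draws) / n
m2 = sum(d[1] for d in draws) / n
cov12 = sum((d[0] - m1) * (d[1] - m2) for d in draws) / n
print(m1, m2, cov12)  # roughly 1.5, 2.5, 0.5
```

Because Cov(X_i+X_0, X_j+X_0) = Var(X_0) = \theta_0 for i \ne j, this construction forces the same covariance between every pair, which is why the more flexible scheme with additional Y_i variables may be preferred.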

**Attribution**

*Source: Link, Question Author: user9171, Answer Author: whuber*