# Probability of an independent Poisson process overtaking another

I have asked this question before, in another form, on other Stack Exchange sites, so apologies for the partial repost.

I have also asked my professor and a couple of PhD students about it, without getting a definitive answer. I will first state the problem, then my attempted solution, and finally the problem with that solution, so apologies for the wall of text.

The problem:

Assume two independent Poisson processes $M$ and $R$, with rates $\lambda_M$ and $\lambda_R$ per interval, subject to $\lambda_R>\lambda_M$. What is the probability that, at some point in time as time tends to infinity, the aggregate output of process $M$ exceeds the aggregate output of process $R$ plus a constant $D$, i.e. $P(M>R+D)$? To illustrate with an example: assume two bridges $R$ and $M$; on average $\lambda_R$ and $\lambda_M$ cars per interval drive over bridges $R$ and $M$ respectively, with $\lambda_R>\lambda_M$, and $D$ cars have already driven over bridge $R$. What is the probability that, at some point in time, more cars in total have driven over bridge $M$ than over bridge $R$?
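Before attacking this analytically, the quantity can be estimated by brute-force simulation. The Python sketch below is my own illustration (the horizon, give-up threshold, trial count and all names are arbitrary choices of mine; note that checking only at interval boundaries slightly undercounts crossings that occur mid-interval):

```python
import math
import random

def poisson_sample(mu, rng):
    """Draw a Poisson(mu) variate via Knuth's method; adequate for small mu."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def estimate_overtake(lam_R=0.6, lam_M=0.4, D=6, max_intervals=3000,
                      give_up_deficit=30, n_trials=5000, seed=3):
    """Fraction of runs in which the total M-count ever exceeds the total
    R-count + D, checked at the end of each unit interval. A run is abandoned
    once R leads by more than D + give_up_deficit, at which point a comeback
    is vanishingly unlikely."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        r = m = 0
        for _ in range(max_intervals):
            r += poisson_sample(lam_R, rng)
            m += poisson_sample(lam_M, rng)
            if m > r + D:      # M has overtaken R by more than D
                hits += 1
                break
            if r - m > D + give_up_deficit:  # hopelessly behind: abandon
                break
    return hits / n_trials

est = estimate_overtake()
print(est)
```

The estimate gives a rough target that any analytical answer should match.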

My way of solving this problem:

First we define the two counting processes: after $I$ intervals,

$$R(I)\sim\text{Poisson}(\lambda_R I)\quad\text{and}\quad M(I)\sim\text{Poisson}(\lambda_M I).$$

The next step is to find a function that gives $P(M(I)>R(I)+D)$ after a given number of intervals $I$. The event occurs when $M(I)>k+D$, conditional on $R(I)=k$, summed over all non-negative values of $k$: if the aggregate output of $R$ is $k$, then the aggregate output of $M$ needs to exceed $k+D$. That is,

$$P\big(M(I)>R(I)+D\big)=\sum_{k=0}^{\infty}P\big(M(I)>k+D\mid R(I)=k\big)\,P\big(R(I)=k\big).$$

Due to independence, each term can be rewritten as the product of two factors, where the first factor is one minus the Poisson CDF (the survival function) and the second is the Poisson pmf:

$$P\big(M(I)>R(I)+D\big)=\sum_{k=0}^{\infty}\Big[1-F_{M(I)}(k+D)\Big]\,\frac{(\lambda_R I)^k e^{-\lambda_R I}}{k!},$$

where $F_{M(I)}$ is the CDF of a $\text{Poisson}(\lambda_M I)$ variable.
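For concreteness, the sum can be evaluated in plain Python with truncated Poisson tails (a sketch; the function names and truncation margins are mine, and no external libraries are assumed):

```python
import math

def p_overtake_at(I, lam_R, lam_M, D):
    """P(M(I) > R(I) + D) for independent M(I)~Pois(lam_M*I), R(I)~Pois(lam_R*I)."""
    mu_R, mu_M = lam_R * I, lam_M * I
    # truncation points ~10 standard deviations past each mean
    kmax = int(mu_R + 10.0 * math.sqrt(mu_R + 1.0)) + 10
    mmax = kmax + D + int(mu_M + 10.0 * math.sqrt(mu_M + 1.0)) + 10
    # Poisson pmfs, built iteratively to avoid large factorials
    pmf_R = [math.exp(-mu_R)]
    for k in range(1, kmax + 1):
        pmf_R.append(pmf_R[-1] * mu_R / k)
    pmf_M = [math.exp(-mu_M)]
    for m in range(1, mmax + 1):
        pmf_M.append(pmf_M[-1] * mu_M / m)
    # tail[j] = P(M(I) >= j), accumulated from the top for accuracy
    tail = [0.0] * (mmax + 2)
    for m in range(mmax, -1, -1):
        tail[m] = tail[m + 1] + pmf_M[m]
    # sum over k of P(R(I) = k) * P(M(I) > k + D)
    return sum(pmf_R[k] * tail[k + D + 1] for k in range(kmax + 1))
```

A brute-force double sum over the joint pmf gives the same values, which makes for an easy correctness check.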

To create an example, assume $D=6$, $\lambda_R=0.6$ and $\lambda_M=0.4$; the graph of that function over $I$ is shown below *(figure omitted)*.

The next step is to find the probability of this happening at any point in time; let's call that $Q$. My thought is that this is equivalent to one minus the probability of $M$ never being above $R+D$. That is, letting $N$ approach infinity, what is $P(R(N)+D \geq M(N))$ conditional on this also being true for all previous values of $N$?

$P(R(I)+D \geq M(I))$ is the same as $1-P(M(I)>R(I)+D)$; let's define that as the function $g(I)$:

$$g(I)=1-P\big(M(I)>R(I)+D\big).$$

As $N$ tends to infinity, this can be rewritten as one minus the geometric integral (the product) of the function $g(I)$:

$$Q = 1-\lim_{N\to\infty}\prod_{I=1}^{N}g(I) = 1-\lim_{N\to\infty}\prod_{I=1}^{N}\Big(1-P\big(M(I)>R(I)+D\big)\Big),$$

where $P(M(I)>R(I)+D)$ is the function derived above.

Now, to me, this should give the final value of $Q$ for any given $D$, $\lambda_R$ and $\lambda_M$. However, there is a problem: we should be able to rescale the lambdas freely, since the only thing that should matter is their ratio. To build on the example from before with $D=6$, $\lambda_R=0.6$ and $\lambda_M=0.4$: this is effectively the same as $D=6$, $\lambda_R=0.06$ and $\lambda_M=0.04$, as long as the interval is divided by 10 (10 cars every 10 minutes is the same as 1 car every minute). However, doing this produces a different result: $D=6$, $\lambda_R=0.6$, $\lambda_M=0.4$ yields $Q=0.5856116$, while $D=6$, $\lambda_R=0.06$, $\lambda_M=0.04$ yields $Q=0.9998507$.

The immediate realization is that $1-(1-0.5856116)^{10}=0.9998507$, and the reason is actually fairly simple if we compare the graphs of the two outcomes (the graph for $D=6$, $\lambda_R=0.06$, $\lambda_M=0.04$ is omitted here): the per-interval probability does not change, but it now takes ten times as many intervals to reach the same level. Since $Q$ depends on the number of intervals, this naturally has an impact.

This means something is wrong, as the result should not depend on my starting lambdas, especially because there is no single correct starting point: $0.04$ and $0.06$ are as valid as $0.4$ and $0.6$, or $1$ and $1.5$, etc., as long as the interval is scaled accordingly. I can easily convert between them (going from $0.4$ and $0.6$ to $0.04$ and $0.06$ amounts to raising $1-Q$ to the tenth power), but since all of these lambdas are equally valid starting points, the result is obviously not correct.
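The discrepancy can be reproduced with a direct Python reconstruction of the computation described above (a sketch under my reading of the construction; the names, truncation limits and interval counts are mine):

```python
import math

def p_over(I, lam_R, lam_M, D):
    """P(M(I) > R(I) + D) for independent Poisson counts, via the sum above."""
    mu_R, mu_M = lam_R * I, lam_M * I
    kmax = int(mu_R + 10.0 * math.sqrt(mu_R + 1.0)) + 10
    mmax = kmax + D + int(mu_M + 10.0 * math.sqrt(mu_M + 1.0)) + 10
    pmf_R = [math.exp(-mu_R)]
    for k in range(1, kmax + 1):
        pmf_R.append(pmf_R[-1] * mu_R / k)
    pmf_M = [math.exp(-mu_M)]
    for m in range(1, mmax + 1):
        pmf_M.append(pmf_M[-1] * mu_M / m)
    tail = [0.0] * (mmax + 2)          # tail[j] = P(M(I) >= j)
    for m in range(mmax, -1, -1):
        tail[m] = tail[m + 1] + pmf_M[m]
    return sum(pmf_R[k] * tail[k + D + 1] for k in range(kmax + 1))

def Q(lam_R, lam_M, D, N):
    """The construction above: Q = 1 - prod over I=1..N of (1 - p_over(I))."""
    log_prod = 0.0
    for I in range(1, N + 1):
        log_prod += math.log1p(-p_over(I, lam_R, lam_M, D))
    return 1.0 - math.exp(log_prod)

q1 = Q(0.6, 0.4, 6, 400)      # original scale
q2 = Q(0.06, 0.04, 6, 4000)   # lambdas / 10, ten times as many intervals
print(q1, q2, 1 - (1 - q1) ** 10)
```

Running it shows the second value is far larger than the first even though only the time scale changed, matching the $1-(1-Q)^{10}$ relationship observed above.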

To show this impact, I graphed $Q$ as a function of $t$, where $t$ is a scaling factor on the lambdas, with starting values $\lambda_M=0.4$ and $\lambda_R=\lambda_M\cdot 1.5$ *(graph omitted)*.

This is where I am stuck: the approach looks fine and correct to me, but the result is obviously wrong. My initial thought is that I am missing a fundamental rescaling somewhere, but I can't for the life of me figure out where.

Thank you for reading, any and all help is greatly appreciated.

Additionally, if anyone wants my R-code please let me know and I will upload it.

---

Let the collective event times of the two processes be $$T=(0=t_0<t_1<t_2<\cdots).$$ Because these are independent Poisson processes, almost surely exactly one of them is observed at each of these times. For $i\gt 0,$ define

$$B(i) = \begin{cases} +1 & \text{if } R \text{ has an event at } t_i \\ -1 & \text{if } M \text{ has an event at } t_i \end{cases}$$

and accumulate the $B(i)$ into the process $W$: that is, $W(0)=0$ and $W(i+1)=W(i)+B(i+1)$ for all $i \ge 0.$ $W(i)$ counts how many more times $R$ has appeared than $M$ just after time $t_i.$

*(Figure omitted.)* The figure shows realizations of $R$ (in red) and $M$ (in medium blue) as "rug plots" at the top. The points plot the values of $(t_i, W(i))$. Each red point represents an increase in the excess $R(t_i)-M(t_i)$ while each blue point shows a decrease in the excess.

For $b=0, 1, 2, \ldots,$ let $\mathcal{E}_b$ be the event that at least one of the $W(i)$ is less than or equal to $-b,$ and let $f(b)$ be its probability.

The question asks for $f(D+1).$

Let $\lambda=\lambda_R+\lambda_M.$ This is the rate of the combined process. $W$ is a binomial random walk, because

$$\Pr(B(i) = 1) = \frac{\lambda_R}{\lambda}\quad\text{and}\quad \Pr(B(i)=-1) = \frac{\lambda_M}{\lambda}.$$
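This step-probability claim is the standard "competing exponentials" fact, and it is easy to check by simulation (a quick sketch; the seed and sample size are arbitrary choices of mine):

```python
import random

# At any combined event time, the probability the event belongs to R is
# lam_R / (lam_R + lam_M). Race two independent exponential inter-arrival
# clocks and count how often R's clock fires first.
rng = random.Random(42)
lam_R, lam_M = 0.6, 0.4
n = 100_000
r_first = sum(rng.expovariate(lam_R) < rng.expovariate(lam_M) for _ in range(n))
frac = r_first / n
print(frac)  # close to lam_R / (lam_R + lam_M) = 0.6
```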

Thus, the answer equals the chance that this binomial random walk $W$ encounters an absorbing barrier at $-D-1.$

The most elementary way to find this chance is to observe that

$$f(0)=1$$

because $W(0)=0$ (so the level $-0$ is reached immediately); and, for all $b \gt 0,$ conditioning on the two possible next steps of $\pm 1$ recursively yields

$$f(b) = \frac{\lambda_R}{\lambda} f(b+1) + \frac{\lambda_M}{\lambda} f(b-1).$$

Assuming $\lambda_R \ge \lambda_M,$ the unique bounded solution for $b\ge 0$ is

$$f(b) = \left(\frac{\lambda_M}{\lambda_R}\right)^b,$$

as you can check by plugging it into the foregoing defining equations. Thus, the answer is

$$\Pr(\mathcal{E}_{D+1})=f(D+1)=\left(\frac{\lambda_M}{\lambda_R}\right)^{D+1}.$$
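For the example in the question ($\lambda_M=0.4$, $\lambda_R=0.6$, $D=6$), the formula gives $(0.4/0.6)^{7} \approx 0.0585$. A Monte Carlo sketch of the absorbing-barrier probability (the cap, seed and trial count are arbitrary choices of mine) should agree:

```python
import random

def hit_prob_mc(lam_R, lam_M, D, n_trials=20_000, cap=40, seed=7):
    """Monte Carlo estimate of the chance the walk W ever reaches -(D+1).
    Steps: +1 w.p. lam_R/(lam_R+lam_M), else -1. A walk that climbs to +cap
    is counted as never returning; the neglected probability is on the order
    of (lam_M/lam_R)^(cap+D+1), negligible here."""
    rng = random.Random(seed)
    p_up = lam_R / (lam_R + lam_M)
    hits = 0
    for _ in range(n_trials):
        w = 0
        while -(D + 1) < w < cap:
            w += 1 if rng.random() < p_up else -1
        hits += (w == -(D + 1))
    return hits / n_trials

est = hit_prob_mc(0.6, 0.4, 6)
exact = (0.4 / 0.6) ** 7
print(est, exact)  # both near 0.0585
```

The closed form is scale-invariant in the lambdas, which resolves the rescaling paradox in the question: only the ratio $\lambda_M/\lambda_R$ enters.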