What does it mean to say that an event “happens eventually”?

Consider a one-dimensional random walk on the integers $\mathbb{Z}$ with initial state $x\in\mathbb{Z}$:

\begin{equation}
S_n=x+\sum^n_{i=1}\xi_i
\end{equation}

where the increments $\xi_i$ are i.i.d. with $P\{\xi_i=1\}=P\{\xi_i=-1\}=\frac{1}{2}$.

One can prove that (1)

\begin{equation}
P^x\{S_n \text{ reaches } +1 \text{ eventually}\} = 1
\end{equation}

where the superscript denotes the initial position.

Let $\tau$ be the first passage time to state $+1$. In other words, $\tau:=\tau(1):=\min\{n\geq 0: S_n=1\}$. One can also prove that, for the walk started at $x=0$, (2)

\begin{equation}
E\tau = +\infty
\end{equation}
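For concreteness, here is a small Monte Carlo sketch of my own (the function name `first_passage_time` and the step cap are my choices, not taken from the notes) that illustrates both statements for the walk started at $x=0$: nearly every simulated path hits $+1$ within the cap, yet the sample mean of $\tau$ keeps growing as the cap is raised.

```python
# Monte Carlo sketch (illustration only): the walk from 0 hits +1 in almost
# every run, but the sample mean of the hitting time does not stabilize.
import random

def first_passage_time(rng, max_steps=10**5):
    """Return the first n >= 1 with S_n = 1, or None if the cap is reached first."""
    s = 0
    for n in range(1, max_steps + 1):
        s += 1 if rng.random() < 0.5 else -1
        if s == 1:
            return n
    return None  # censored: tau exceeded max_steps

rng = random.Random(2024)
samples = [first_passage_time(rng) for _ in range(2_000)]
hits = [t for t in samples if t is not None]
print(f"fraction hitting +1 within the cap: {len(hits) / len(samples):.3f}")  # near 1
print(f"sample mean of tau among those runs: {sum(hits) / len(hits):.1f}")    # grows with the cap
```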

Both proofs can be found at http://galton.uchicago.edu/~lalley/Courses/312/RW.pdf. Having read those notes, I understand both proofs.

My question, however, is what “eventually” means in the first statement, and in general. If something happens “eventually”, it doesn’t have to occur in finite time, does it? If so, what really is the difference between something that doesn’t happen and something that doesn’t happen “eventually”? Statements (1) and (2) seem, in some sense, to contradict each other. Are there other examples like this?


EDIT

Just want to add a motivation for the question, i.e., a straightforward example of something that happens “eventually”, but with finite expected wait time.

\begin{equation}
\begin{split}
P\{\text{walker eventually moves left}\} & = 1 - P\{\text{walker never moves left}\} \\
& = 1-\lim_{n\to\infty} \frac{1}{2^n} \\
& = 1
\end{split}
\end{equation}

Therefore we know that the walker will “eventually” move to the left; since the wait time until the first left move is geometrically distributed with success probability $1/2$, the expected wait is $1/(1/2)=2$.
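As a quick sanity check, here is a sketch of my own (the helper `steps_until_first_left` is a hypothetical name, not from any source): the simulated mean wait comes out close to $2$.

```python
# Simulation sketch: number of fair-coin steps until the first "left" move.
import random

rng = random.Random(0)

def steps_until_first_left():
    """Count steps until the walker first moves left (a geometric(1/2) draw)."""
    n = 0
    while True:
        n += 1
        if rng.random() < 0.5:  # this step is a left move
            return n

waits = [steps_until_first_left() for _ in range(100_000)]
print(sum(waits) / len(waits))  # close to the theoretical mean 1/(1/2) = 2
```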

Seeing something that happens “eventually” but with infinite expected “wait time” was quite a stretch for my imagination. The second half of @whuber’s response is another great example.

Answer

How would you demonstrate an event “eventually happens”? You would conduct a thought experiment with a hypothetical opponent. Your opponent may challenge you with any positive number $p$. If you can find an $n$ (which most likely depends on $p$) for which the chance of the event happening by time $n$ is at least $1-p$, then you win.
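To make the game concrete, here is a small sketch of my own (not whuber’s code; `prob_reached_by` and `answer_challenge` are hypothetical names): given a challenge $p$, it computes the exact probability that the walk started at $0$ reaches $+1$ by time $n$, and searches for a winning $n$.

```python
# Playing the game: for a challenge p, exhibit an n with
# P{ walk from 0 reaches +1 by time n } >= 1 - p.
from collections import defaultdict

def prob_reached_by(n_steps):
    """Exact P{ the walk from 0 hits +1 within n_steps steps }, computed by
    propagating the walk's distribution step by step with absorption at +1."""
    dist = {0: 1.0}   # mass on positions that have not yet hit +1
    absorbed = 0.0
    for _ in range(n_steps):
        new = defaultdict(float)
        for pos, mass in dist.items():
            for nxt in (pos - 1, pos + 1):
                if nxt == 1:
                    absorbed += 0.5 * mass
                else:
                    new[nxt] += 0.5 * mass
        dist = new
    return absorbed

def answer_challenge(p):
    """Smallest power-of-two horizon n with prob_reached_by(n) >= 1 - p."""
    n = 1
    while prob_reached_by(n) < 1 - p:
        n *= 2
    return n

print(answer_challenge(0.1))   # a horizon that wins against p = 0.1
```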

In the example, “$S_n$” is misleading notation because you use it both to refer to one state of a random walk and to the entire random walk itself. Let’s take care to recognize the distinction. “Reaches $1$ eventually” is meant to refer to a subset $\mathcal{S}$ of the set of all random walks $\Omega$. Each walk $S\in\Omega$ has infinitely many steps. The value of $S$ at time $n$ is $S_n$. “$S$ reaches $1$ by time $n$” refers to the subset of $\Omega$ of walks that have reached the state $1$ by time $n$. Rigorously, it is the set

$$\Omega_{1,n} = \{S\in\Omega \mid S_1=1\text{ or }S_2=1\text{ or }\cdots\text{ or }S_n=1\}.$$

In your response to the imaginary opponent, you are exhibiting some $\Omega_{1,n}$ with the property that

$$\mathbb{P}_\xi(\Omega_{1,n}) \ge 1-p.$$

Because $n$ is arbitrary, you have available all elements of the set

$$\Omega_{1,\infty} = \bigcup_{n=1}^\infty \Omega_{1,n}.$$

(Recall that $S \in \bigcup_{n=1}^\infty \Omega_{1,n}$ if and only if there exists a finite $n$ for which $S \in \Omega_{1,n}$, so there aren’t any infinite numbers involved in this union.)

Your ability to win the game shows this union has a probability exceeding all values of the form $1-p$, no matter how small $p\gt 0$ may be. Consequently, that probability is at least $1$, and therefore equals $1$. You will have demonstrated, then, that

$$\mathbb{P}_\xi(\Omega_{1,\infty}) = 1.$$


One simple way to appreciate the distinction between “happening eventually” and having an infinite expected first passage time is to contemplate a simpler situation. For $n$ any natural number, let $\omega^{(n)}$ be the sequence

$$\omega^{(n)} = (\underbrace{0, 0,\ldots,0}_{n},1,1,\ldots)$$

in which $n$ zeros are followed by an endless string of ones. In other words, these are the walks that stay at the origin and at some (finite) time step over to the point $1$, then stay there forever.

Let $\Omega$ be the set of all these $\omega^{(n)}, n=0, 1, 2, \ldots$ with the discrete sigma algebra. Assign a probability measure via

$$\mathbb{P}(\omega^{(n)}) = \frac{1}{n+1} - \frac{1}{n+2} = \frac{1}{(n+1)(n+2)}.$$

This was designed to make the chance of jumping to $1$ by time $n$ equal to $1-1/(n+1)$, which comes arbitrarily close to $1$. You will win the game. The jump eventually happens, and when it does, it happens at some finite time. However, the expected time of the jump is the sum of the survival function (which gives the chance of not yet having jumped at time $n$),

$$\mathbb{E}(\tau) = \frac{1}{1} + \frac{1}{2} + \frac{1}{3} + \cdots,$$

which diverges. That is because a relatively large probability is given to waiting a long time before jumping.
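Here is a quick numeric check of this construction (a sketch of my own, not part of the original answer): the probabilities sum to $1$, the jump happens by time $n$ with probability $1-1/(n+1)$, and the partial sums of the survival function keep growing.

```python
# Numeric check of the constructed example: total probability tends to 1, the
# jump happens by time n with probability 1 - 1/(n+1), yet the partial sums of
# the survival function (the harmonic series) grow without bound.
N = 100_000
p = [1.0 / ((n + 1) * (n + 2)) for n in range(N)]

print(sum(p))                                # ~ 1 - 1/(N+1): the jump happens eventually
print(sum(p[:100]), 1 - 1 / 101)             # chance of having jumped by time 100
print(sum(1.0 / (n + 1) for n in range(N)))  # partial sum of E(tau): ~ log N, still growing
```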

Attribution
Source: Link, Question Author: Ye Tian, Answer Author: whuber
