Standard Problem description
In the question The Frog Problem (a puzzle from a YouTube video), a frog has to jump from leaf to leaf along a row of leaves, and the question is how long it takes on average to reach the end.
In that specific case the frog only jumps to the leaves in front of him, each with equal probability. It is computed that the expected number of steps to reach the end is $J_n = \sum_{k=1}^n 1/k$ when the frog has $n$ leaves in front of him.
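This harmonic-number result is easy to verify with exact arithmetic. A small sketch (in Python, for illustration; not part of the original post) iterates the first-step relation $J_n = 1 + \frac{1}{n}\sum_{k=0}^{n-1} J_k$ of the forward-only frog:

```python
from fractions import Fraction

def forward_frog(n):
    # J_0 = 0; with k leaves ahead the frog jumps to any of them
    # uniformly, so J_k = 1 + (1/k) * (J_0 + ... + J_{k-1})
    J = [Fraction(0)]
    for k in range(1, n + 1):
        J.append(1 + sum(J) / k)
    return J[n]

harmonic = lambda n: sum(Fraction(1, k) for k in range(1, n + 1))
print(forward_frog(10) == harmonic(10))  # exact equality with H_10: True
```

Using `Fraction` keeps the comparison exact instead of relying on floating-point tolerance.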
New extended problem
But what is the solution when the frog can also remain still or go one step backwards? (There are infinitely many leaves behind the frog; the game only ends when the frog has no leaves in front of him.)
This would lead to a recurrence relation like: $$J_{n} = (n+1) J_{n-1} - n J_{n-2} - 1$$
To make the solution final we need to know $J_0$ and $J_1$.
It is trivial that the expected number of steps for a frog with zero leaves in front of him is 0 ($J_0 = 0$).
But what is $J_1$? What is the expected number of steps for a frog that has only one leaf to go?
Derivation/intuition of the recurrence relation:
$$J_{n+1} = (n+2) J_{n} - (n+1) J_{n-1} - 1$$
There are $n+2$ leaves to go to. Jumping to one of the $n$ leaves in front of the frog or to the leaf on which the frog is sitting gives the same set of target positions as for a frog who has $n-1$ leaves in front of him/her, so conditional on not jumping backwards the frog behaves like that frog. The one leaf backwards results in the frog being in the situation of a frog that has $n+1$ leaves in front of him/her, but having taken an extra step.
$$J_{n} = \frac{n+1}{n+2} {J_{n-1}} + \frac{1}{n+2} {(J_{n+1}+1)}$$
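The recurrence can be sanity-checked numerically. A sketch (in Python, whereas the post's own code is in R) solves the system $J_n = 1 + \frac{1}{n+2}\sum_{t=0}^{n+1} J_t$ on a chain truncated at $N$ states; the truncation, which removes the backward jump at state $N$, is an assumption of the sketch, but the recurrence between interior states is unaffected by it:

```python
import numpy as np

def truncated_J(N):
    # States n = 0..N (leaves in front); state 0 is absorbing with J_0 = 0.
    # From 1 <= n < N the frog lands uniformly on states 0..n+1;
    # at the truncation boundary n = N the backward jump is removed.
    A = np.zeros((N, N))
    b = np.ones(N)
    for n in range(1, N + 1):
        targets = range(0, min(n + 1, N) + 1)   # reachable states
        p = 1.0 / len(targets)
        A[n - 1, n - 1] = 1.0
        for t in targets:
            if t >= 1:
                A[n - 1, t - 1] -= p
    return np.concatenate(([0.0], np.linalg.solve(A, b)))

J = truncated_J(200)
n = 5   # any interior state with 2 <= n <= N-1
print(J[n + 1] - ((n + 2) * J[n] - (n + 1) * J[n - 1] - 1))  # ~ 0
```

The residual is at machine-precision level, since the recurrence is an exact algebraic consequence of the solved balance equations.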
Solution attempt 1
It seems like the solution is close to $J_n = c + \sum_{k=1}^n 1/k$ with some constant $c$… but not exactly. When I fill that expression into the recurrence relation I get:
$$\begin{array}{rcl}
J_{n} &=& (n+1) J_{n-1} - n J_{n-2} - 1 \\
c + \sum_{k=1}^n \frac{1}{k} &=& (n+1) \left(c + \sum_{k=1}^{n-1} \frac{1}{k} \right) - n \left(c + \sum_{k=1}^{n-2} \frac{1}{k} \right) - 1 \\
\sum_{k=1}^n \frac{1}{k} &=& (n+1) \left(\sum_{k=1}^{n-1} \frac{1}{k} \right) - n \left(\sum_{k=1}^{n-2} \frac{1}{k} \right) - 1 \\
\frac{1}{n} + \frac{1}{n-1} &=& (n+1)\frac{1}{n-1} - 1\\
\frac{1}{n} + \frac{1}{n-1} &=& \frac{2}{n-1} \\
\end{array}$$which is a contradiction.
Solution attempt 2
Simulation by Markov chain (this results in something that looks like $J_n = c + \sum_{k=1}^n 1/k$, but as shown before that cannot be exactly true):
    nm <- 50
    nn <- 300   # number of steps iterated (assumed; large enough for the remaining tail to be negligible)
    library(expm)

    ev <- rep(0, 40)  # storage for the computed expectations
    for (n in 1:40) {
      # stochastic matrix
      M <- pracma::Toeplitz(c(rep(1, nm+1)), c(1, 1, rep(0, nm-1))) / c(2:(nm+2))
      M[1, 1:2] <- c(1, 0)

      # positions of frogs after k steps
      V <- c(rep(0, n), 1, rep(0, nm-n))
      Vm <- sapply(0:nn, FUN = function(k) V %*% (M %^% k))

      # mean number of steps by computing 1-F(0)
      E <- sum(1 - Vm[1, ])
      ev[n] <- E
    }

    n <- 1:40
    plot(n, ev, xlim = c(0, 20))
    title("simulated \n expected number of steps", cex.main = 1)

    H <- cumsum(1/n)
    mod <- lm(ev - H ~ 1)
    lines(n, H + coef(mod)[1])
    coef(mod)
    legend(0, 6.6, c("simulation", "2.325547 + H_n"),
           cex = 0.7, lty = c(NA, 1), pch = c(1, NA))

    Jn  <- ev[-c(1, 2)]    # J_n      for n = 3..40
    Jn1 <- ev[-c(1, 40)]   # J_{n-1}
    Jn2 <- ev[-c(39, 40)]  # J_{n-2}
    np  <- n[-c(1, 2)]
    plot(Jn, (np+1)*Jn1 - np*Jn2 - 1,
         xlab = expression(J[n]),
         ylab = expression((n+1) %*% J[n-1] - n %*% J[n-2] - 1))
    lines(c(0, 10), c(0, 10))
    title("testing recurrence relation", cex.main = 1)
In this answer to the simpler problem, the motion of the frog is not computed by using the recurrence relation, but by writing down the probability distribution of where the frog might be after $k$ jumps.
In that case the distribution is like a diffusing wave, which will eventually be absorbed completely in the final leaf. In the present case we cannot compute it in the same closed form, because after any finite number of jumps there is a tiny fraction of frogs that has not yet reached the goal. But maybe we can solve the puzzle with this starting point, by finding some explicit solution or by changing the expression to include the backwards leaves?
Answer
Solution for $J_1$
But what is $J_1$? What is the expected number of steps for a frog that has only one leaf to go?
The solution is $J_1 = 2(e-1)$, and the other terms $J_n$ can be expressed as a finite sum.
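This value, $J_1 = 2(e-1) \approx 3.4366$, can be checked by direct simulation (a sketch in Python, not part of the original post): from a state with $n$ leaves in front, the next state is uniform on $\{0, 1, \dots, n+1\}$.

```python
import math
import random

def frog_steps(n, rng):
    # one frog: from n leaves in front, land uniformly on states 0..n+1
    # (n forward jumps, staying put, or one step back)
    steps = 0
    while n > 0:
        n = rng.randint(0, n + 1)   # inclusive bounds
        steps += 1
    return steps

rng = random.Random(42)
trials = 200_000
estimate = sum(frog_steps(1, rng) for _ in range(trials)) / trials
print(estimate, 2 * (math.e - 1))  # the estimate should be close to 3.4366
```

With $2 \cdot 10^5$ trials the Monte Carlo standard error is of order $0.01$, so the agreement is visible to about two decimals.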
Rewriting the recurrence relation as a sum
The recurrence relation will not solve the problem on its own (because one term of the initial conditions, $J_1$, is unknown), but it does allow us to express $J_n$ as a finite sum.
The recurrence relation can be rewritten (for $n \geq 3$) as
$$J_n - J_{n-1} = n (J_{n-1} - J_{n-2}) - 1 $$
Let $D_n = J_n - J_{n-1}$; then
$$D_n = n D_{n-1} - 1 $$
and with starting point $D_2 = 2x$, where $x$ is still to be determined, we can write (note that the recurrence relation is a bit different for $n = 2$, as @quester noted in the comments):
$$\begin{array}{rcrcrcrcrcrcr}
D_1 &=& 3 + 2\,x \\
D_2 &=& 2\,x\\
D_3 &=& 6\,x &-& 1\\
D_4 &=& 24\,x &-& 4 &-& 1 \\
D_5 &=& 120\,x &-& 5\cdot 4 &-& 5 &-& 1 \\
D_6 &=& 720\,x &-& 6\cdot 5\cdot 4 &-& 6\cdot 5 &-& 6 &-& 1 \\
D_7 &=& 5040\,x &-& 7\cdot 6\cdot 5\cdot 4 &-& 7\cdot 6\cdot 5 &-& 7\cdot 6 &-& 7 &-& 1 \\
D_k &=& k!\,x &-& \rlap{\sum_{l=3}^{k} \frac{k!}{l!}} \\
\end{array}$$
(each line equals $k$ times the previous line, minus $1$)
and
$$ J_n = \sum_{k=1}^n D_k = 3 + x + x \sum_{k=1}^n k! - \sum_{k=3}^n\sum_{l=3}^{k} \frac{k!}{l!} $$
(the extra $3+x$ comes from $D_1 = 3 + 2x$, whose coefficient of $x$ is $2$ rather than $1!$)
Closed form expression for $x$
Let's rewrite $D_k$:
$$\begin{array}{rcl} D_k &=& k!\, x - \sum_{l=3}^{k} \frac{k!}{l!}\\ &=& k! \left(x - \sum_{l=0}^k \frac{1}{l!} + 2.5 \right)\end{array}$$
If we conjecture that $\lim_{k \to \infty }D_k$ is positive and finite, then this leads to the requirement $\lim_{k \to \infty }\left(x - \sum_{l=0}^k \frac{1}{l!} + 2.5 \right)= 0$ and
$$x = \lim_{k \to \infty } \left(\sum_{l=0}^k \frac{1}{l!}\right) - 2.5 = e-2.5 \approx 0.2182818 $$
The argument that $\lim_{k \to \infty }D_k$ is finite is still a conjecture but it seems plausible to me.
Closed form expression for $D_k$
Filling in $x$ into the expression for $D_k$ leads to:
$$\begin{array}{rclcl}
J_1 &=& D_1 & = & 2e-2 \\
&& D_2 & = & 2e-5 \\
&& D_k & = & k! \left( e - \sum_{l=0}^k \frac{1}{l!}\right) \\
\end{array}$$
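A sketch in Python (not part of the original post) evaluating these closed forms: it reconstructs $J_n = \sum_{k \leq n} D_k$ and confirms the original recurrence $J_n = (n+1)J_{n-1} - nJ_{n-2} - 1$. Note that $k!\left(e - \sum_{l\leq k} 1/l!\right)$ suffers catastrophic cancellation in floating point for large $k$, so only small $k$ are used here.

```python
import math

def D(k):
    # D_1 = 2e - 2; for k >= 2, D_k = k! * (e - sum_{l=0}^k 1/l!)
    if k == 1:
        return 2 * math.e - 2
    tail = math.e - sum(1 / math.factorial(l) for l in range(k + 1))
    return math.factorial(k) * tail

def J(n):
    # J_n = D_1 + D_2 + ... + D_n  (with J_0 = 0)
    return sum(D(k) for k in range(1, n + 1))

# the recurrence J_n = (n+1) J_{n-1} - n J_{n-2} - 1 should hold for n >= 3
for n in range(3, 8):
    lhs = J(n)
    rhs = (n + 1) * J(n - 1) - n * J(n - 2) - 1
    print(n, round(lhs, 9), round(rhs, 9))
```

The two columns agree to the printed precision, and $J(1)$ reproduces $2(e-1)$.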
Is $J_n$ finite?
We can argue that $J_n$ (the mean number of steps to reach the finish) is finite for any starting point $n$, because the mean distance from the finish decreases to zero, bounded by an exponential decay.

The mean distance from the finish: Say a frog starts in position $x$. Then after one jump the frog will be somewhere in position $0 \leq y \leq x+1$ (each option with probability $\frac{1}{x+2}$), and if $y \neq 0$ then after two jumps the frog will be somewhere in position $0 \leq z \leq y+1$ (each option with probability $\frac{1}{y+2}$). Then the mean position $\bar{z}$ of a frog that started at $x$ and makes two jumps will be:
$$ \sum_{y=1}^{x+1}\sum_{z=1}^{y+1}\frac{z}{(x+2)(y+2)} = \frac{x^2+5x+4}{4x+8} \underbrace{\leq\frac{10}{12}x}_{\text{if $x\geq1$}}$$
So whatever the position of the frog, after two jumps he will on average be at least a factor $\frac{1}{6}$ closer to the finish.
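The two-jump identity and the $\frac{10}{12}x$ bound can be verified exactly with rational arithmetic (a sketch in Python, not part of the original post):

```python
from fractions import Fraction

def mean_after_two_jumps(x):
    # sum_{y=1}^{x+1} sum_{z=1}^{y+1} z / ((x+2)(y+2)),
    # i.e. the mean position of the not-yet-finished frogs after two jumps
    total = Fraction(0)
    for y in range(1, x + 2):
        for z in range(1, y + 2):
            total += Fraction(z, (x + 2) * (y + 2))
    return total

for x in range(1, 50):
    m = mean_after_two_jumps(x)
    assert m == Fraction(x * x + 5 * x + 4, 4 * x + 8)   # the closed form
    assert m <= Fraction(10, 12) * x                     # contraction bound
print("identity and bound hold for x = 1..49")
```

At $x = 1$ the bound is tight: the mean is exactly $\frac{10}{12}$.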
Probability that a frog is still in the game: Note that the probability that a frog is still in the game relates to the mean distance of the frogs still in the game. The mean distance after $k$ jumps is $\mu_{\bar{x},k} = \sum_{x=1}^\infty x f(x,k)$, where $f(x,k)$ is the probability that the frog is in position $x$ after $k$ jumps. The probability that a frog is still in the game is: $$ 1-F_{(\text{finished at or before jump } k)}(k)=\sum_{x=1}^\infty f(x,k) \leq \mu_{\bar{x},k} \leq x_{start} \left(\frac{10}{12} \right)^{k/2}$$

Finiteness of $J_n$: The mean number of steps necessary can be found as $\sum_{k=0}^\infty k f(k)$, with $f(k)$ the probability that it takes exactly $k$ steps. But you can also take $\sum_{k=0}^\infty 1-F(k)$, with $F(k)$ the probability that it takes $k$ or fewer steps (note that the integral of the CDF is related to the mean, or more generally the expected value of any quantity is related to the quantile function). And since $1-F(k)$ is smaller than some decreasing exponential function of $k$, the sum must be smaller than the sum/integral of that function, which is finite.
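The exponential tail bound can also be checked against a truncated transition matrix (a Python sketch, not part of the original post; the truncation at $N$ states removes the backward jump at the last state, which only speeds up absorption and so does not invalidate the check):

```python
import numpy as np

N = 60          # truncation of the state space (assumed large enough)
M = np.zeros((N + 1, N + 1))
M[0, 0] = 1.0   # state 0 (finished) is absorbing
for n in range(1, N + 1):
    hi = min(n + 1, N)              # furthest reachable state
    M[n, :hi + 1] = 1.0 / (hi + 1)  # uniform over states 0..hi

x_start = 5
v = np.zeros(N + 1)
v[x_start] = 1.0
ok = True
for k in range(1, 200):
    v = v @ M
    in_game = 1.0 - v[0]            # probability the frog is still in the game
    ok = ok and in_game <= x_start * (10 / 12) ** (k / 2) + 1e-12
print(ok)  # the bound held for every k checked
```

The observed survival probability in fact decays considerably faster than the $\left(\frac{10}{12}\right)^{k/2}$ guarantee.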
Starting with a simpler problem
With the recurrence relation $D_n = n D_{n-1} - 1$ it is problematic to solve the case, because the starting condition is not defined.
We can instead try to pose a simpler problem (suggested in the comments by @quester and @Hans). Let's say that there are only $m+2$ leaves (instead of infinitely many), and thus the frog with $m$ leaves in front of him will not be able to jump backwards. Then $J_m = J_{m-1}$ (the frog in point $m$ has the same options as the frog in point $m-1$) and we will have
$$D_{m} = m! \left(x_m - \sum_{l=0}^m \frac{1}{l!} + 2.5 \right) = 0$$
which gives a solution for $x_{m}$ as:
$$x_m = \sum_{l=0}^m \frac{1}{l!} - 2.5 $$
and the limit of $x_m$ as we start adding leaves is:
$$\lim_{m\to \infty} x_m = \lim_{m\to \infty} \left( \sum_{l=0}^m \frac{1}{l!} \right) - 2.5 = e-2.5$$
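The convergence of $x_m$ is extremely fast (the error is of order $1/m!$), as a small Python sketch (not part of the original post) shows:

```python
import math

def x_m(m):
    # boundary condition D_m = 0 for a pond with m+2 leaves
    # gives x_m = sum_{l=0}^m 1/l! - 2.5
    return sum(1 / math.factorial(l) for l in range(m + 1)) - 2.5

for m in (2, 4, 6, 10):
    print(m, x_m(m))
print("limit:", math.e - 2.5)  # approximately 0.2182818
```

Already at $m = 10$ the value agrees with $e - 2.5$ to about eight decimal places.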
Attribution
Source: Link, Question Author: Sextus Empiricus, Answer Author: Sextus Empiricus