1. Hitting time
Let $X_0, X_1, \ldots \sim \text{Markov}(\lambda, P)$ be a sequence of random variables, and suppose $X_n \colon (\Sigma, \mathcal F, \mathbb P) \to (E, \mathcal E)$ for each $n \in \{0, 1, \ldots \}$. The hitting time of $(X_n)$ on a set $A \subseteq E$ is defined by
$$H^A = \inf\{n \geq 0 : X_n \in A\},$$
with the convention $\inf \emptyset = \infty$.
Also, $H_i^A$ for $i \in E$ denotes the hitting time on $A$ given $X_0 = i$. For the rest of this note, the subscript denotes the initial or given condition unless otherwise noted.
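As a concrete illustration, here is a minimal Python sketch (not part of the original note; the chain, the helper names, and the 3-state transition matrix are our own illustrative choices) that samples a path of a finite-state chain and reads off the first index at which it enters $A$, a finite-horizon stand-in for $H^A$:

```python
import random

def hitting_time(path, A):
    """First index n with path[n] in A, i.e. H^A along this sample path.

    Returns None when the path never enters A within its horizon
    (the finite-sample analogue of H^A = infinity).
    """
    for n, x in enumerate(path):
        if x in A:
            return n
    return None

def simulate_chain(P, x0, steps, rng):
    """Sample X_0, ..., X_steps from a row-stochastic matrix P, X_0 = x0."""
    path = [x0]
    for _ in range(steps):
        row = P[path[-1]]
        path.append(rng.choices(range(len(row)), weights=row)[0])
    return path

# Hypothetical 3-state chain on E = {0, 1, 2}; state 2 is absorbing.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [0.0, 0.0, 1.0]]
rng = random.Random(0)
path = simulate_chain(P, x0=0, steps=50, rng=rng)
print(hitting_time(path, A={2}))
```

Note that `hitting_time(path, A) == 0` whenever the start state is already in $A$, matching the convention that $H_i^A = 0$ for $i \in A$.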
1.1. Probability of Hit
Consider the probability that $H_i^A$ is finite, $h_i^A = \mathbb P_i(H^A < \infty)$. The following result holds: the vector $(h_i^A)_{i \in E}$ is the minimal non-negative solution of the system
$$x_i = 1 \quad (i \in A), \qquad x_i = \sum_{j \in E} p_{ij} x_j \quad (i \notin A). \tag{$\ast$}$$
Intuitively, $h_i^A$ represents the probability of ever hitting $A$ from $i$.
Proof. We first prove the first statement, i.e., that the hitting probabilities satisfy the given equations. If $i \in A$, the equations are trivially satisfied, since $i \in A$ implies $H_i^A = 0$ and hence $h_i^A = 1$.
Consider $i \notin A$. Then at least one step is required to reach $A$, so we can condition on the value of $X_1$ and use the Markov property to get
$$h_i^A = \sum_{j \in E} p_{ij}\, \mathbb P_j(H^A < \infty) = \sum_{j \in E} p_{ij}\, h_j^A.$$
For a given state $i \notin A$, the chain started at $i$ hits $A$ with certainty only when, from every possible successor state $j$, the chain hits $A$ with certainty.
Next, we prove the second statement, i.e., that $(h_i^A)_{i \in E}$ is the smallest non-negative solution of the system $(\ast)$. Note that if $i \in A$, then $h_i^A = 1 = x_i$ for any solution $(x_i)$, so all solutions agree on $A$.
Consider $i \notin A$. Then, for any non-negative solution $(x_i)$,
$$x_i = \sum_{j \in A} p_{ij} + \sum_{j \notin A} p_{ij}\, x_j \geq \sum_{j \in A} p_{ij} = \mathbb P_i(H^A \leq 1),$$
and substituting this bound back into the right-hand side repeatedly gives, by induction on $n$, that $x_i \geq \mathbb P_i(H^A \leq n)$ for every $n$.
We have shown that any non-negative solution $x_i$ is lower bounded by $\mathbb P_i(H^A \leq n)$ for all $n$. Therefore, letting $n \to \infty$,
$$x_i \geq \lim_{n \to \infty} \mathbb P_i(H^A \leq n) = \mathbb P_i(H^A < \infty) = h_i^A,$$
so $(h_i^A)_{i \in E}$ is indeed the minimal non-negative solution.
It can be noted that $x_i \equiv 1$ is a solution to $(\ast)$, though not necessarily the minimal one, since $h_i^A$ is a probability and thus $h_i^A \leq 1$. The existence of this solution guarantees that a minimal non-negative solution exists. Also, the system $(\ast)$ can be expressed using a projection operator $\Pi_A$, defined by element-wise multiplication with the indicator function, $(\Pi_A x)_i = \mathbf 1_{\{i \in A\}}\, x_i$, as
$$x = \Pi_A \mathbf 1 + (I - \Pi_A) P x.$$
This formulation is related to discrete harmonic analysis, and the system of equations can be viewed as a discrete version of Laplace's equation with a boundary condition given on $A$.
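The fixed-point form above also suggests a way to compute the minimal solution numerically: starting from the zero vector off $A$ and iterating the map $x \mapsto \Pi_A \mathbf 1 + (I - \Pi_A) P x$, the iterates increase monotonically to $(h_i^A)$. A short Python sketch (the 4-state example chain is our own; it is a symmetric walk absorbed at 0 and 3, for which $h_1 = 2/3$ and $h_2 = 1/3$):

```python
def minimal_hitting_probs(P, A, iters=200):
    """Iterate x <- Pi_A 1 + (I - Pi_A) P x starting from x = 0 off A.

    From the zero vector the iterates increase monotonically to the
    minimal non-negative solution of (*), i.e. to (h_i^A).
    """
    n = len(P)
    x = [1.0 if i in A else 0.0 for i in range(n)]
    for _ in range(iters):
        x = [1.0 if i in A else sum(P[i][j] * x[j] for j in range(n))
             for i in range(n)]
    return x

# Hypothetical chain on {0, 1, 2, 3}: states 0 and 3 absorbing, A = {0}.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 1.0]]
h = minimal_hitting_probs(P, A={0})
print(h)  # approximately [1.0, 2/3, 1/3, 0.0]
```

Starting from zero (rather than, say, from $x \equiv 1$, which is also a fixed point) is what makes the iteration converge to the *minimal* solution.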
1.2. Gambler’s Ruin Problem
A gambler plays a game. He starts with $k$ dollars in his wallet, and every round he wins or loses 1 dollar. He plays until he either has $N$ dollars in his wallet, at which point he is satisfied and stops, or loses all his money. We are interested in whether he goes bankrupt. Let $X_n$ be the budget after $n$ games, and suppose the chance of winning each round is a fixed constant $p \in (0, 1)$. Then $(X_n)$ is a Markov chain.
$X_n$ takes values in $I = \{0, 1, \ldots, N\}$. The initial distribution is $X_0 = k$. The transition matrix is given by
$$p_{i,i+1} = p, \qquad p_{i,i-1} = q := 1 - p \qquad (1 \leq i \leq N - 1),$$
and $0$ and $N$ are both absorbing states, i.e., $p_{00} = p_{NN} = 1$. The event that the gambler loses all his money is the event of hitting $A = \{0\}$. Then the probability of going bankrupt is
$$h_k = h_k^{\{0\}} = \mathbb P_k\big(H^{\{0\}} < \infty\big).$$
Since $0 \in A$, $h_0 = 1$. Since $N$ is an absorbing state, the chain can never reach $0$ once started from $N$; therefore, $h_N = 0$. For $i \in \{1, 2, \ldots, N - 1\}$, conditioning on the first step gives
$$h_i = p\, h_{i+1} + q\, h_{i-1}.$$
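Solving this linear recurrence with the boundary conditions $h_0 = 1$, $h_N = 0$ yields the classical closed form $h_k = \big(r^k - r^N\big)/\big(1 - r^N\big)$ with $r = q/p$ when $p \neq q$, and $h_k = (N - k)/N$ in the symmetric case $p = q = 1/2$. A small Python sketch (the function name and the sanity-check values are ours) that evaluates the closed form and verifies it against the recurrence:

```python
def ruin_prob(k, N, p):
    """Bankruptcy probability h_k for gambler's ruin on {0, ..., N}.

    Closed-form solution of h_i = p h_{i+1} + q h_{i-1} with
    h_0 = 1, h_N = 0, where q = 1 - p.
    """
    q = 1.0 - p
    if abs(p - q) < 1e-12:          # symmetric walk: h_k = (N - k) / N
        return 1.0 - k / N
    r = q / p
    return (r**k - r**N) / (1.0 - r**N)

# Sanity check: boundary conditions and the one-step recurrence hold.
N, p = 10, 0.45
h = [ruin_prob(i, N, p) for i in range(N + 1)]
assert h[0] == 1.0 and h[N] == 0.0
for i in range(1, N):
    assert abs(h[i] - (p * h[i + 1] + (1 - p) * h[i - 1])) < 1e-9
```

With an unfavorable game ($p < 1/2$), $r > 1$ and $h_k$ stays close to 1 even for moderately large starting budgets $k$, which is the quantitative content of "the house always wins."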