Conditional Expectation: Exercises
Introduction
Example problems
Problem 1
We can calculate $\mathbb{E}(X_n)$ by conditioning on the value of $X_{n-1}$. In other words, we can use the conditional expectation theorem to write: $$ \mathbb{E}(X_n) = \sum_{k=n-1}^\infty \mathbb{E}(X_n|X_{n-1}=k)P(X_{n-1}=k) $$ To calculate the conditional expectation, $\mathbb{E}(X_n|X_{n-1}=k)$, in this expression we use the conditional expectation theorem once more. This time we condition on the outcome of the $(k+1)$th game of backgammon, which is indicated in the following equations using the random variable $Y$: \[ \begin{aligned} \mathbb{E}(X_n|X_{n-1} = k) & = \mathbb{E}(X_n| X_{n-1}=k \wedge Y=1)P(Y=1) + \mathbb{E}(X_n|X_{n-1}=k \wedge Y=0)P(Y=0) \\ & = ( k + 1)p + (k + 1 + \mathbb{E}(X_n) )(1-p) \\ & = k + 1 + (1-p)\mathbb{E}(X_n) \end{aligned} \] Here $\mathbb{E}(X_n| X_{n-1}=k \wedge Y=1)=k+1$ because, if Niall has just won $n-1$ consecutive games and wins the next game (the $(k+1)$th game in the sequence), he has the required $n$-long streak of wins. Furthermore, $\mathbb{E}(X_n| X_{n-1}=k \wedge Y=0)=k + 1 + \mathbb{E}(X_n)$ because Niall's streak of $n$ victories will not have occurred if he loses a game ($Y=0$) immediately after winning $n-1$ consecutive games; his streak of wins falls short by one in this case. Since the outcome of each individual game is independent of the others, he must then wait a further random number of games, distributed identically to $X_n$, before a streak of $n$ wins occurs, so $k+1+X_n$ games will be required in total. Taking the expectation of this random quantity and exploiting the linearity of the expectation operator gives the result $\mathbb{E}(X_n| X_{n-1}=k \wedge Y=0)=k + 1 + \mathbb{E}(X_n)$.
We can now insert the expression we obtained for $\mathbb{E}(X_n|X_{n-1} = k)$ into the expression for $\mathbb{E}(X_n)$ above and rearrange to get the desired answer: \[ \begin{aligned} \mathbb{E}(X_n) & = \sum_{k=n-1}^\infty \mathbb{E}(X_n|X_{n-1}=k)P(X_{n-1}=k) \\ & = \sum_{k=n-1}^\infty ( k + 1 + (1-p)\mathbb{E}(X_n) )P(X_{n-1}=k) \\ & = \sum_{k=n-1}^\infty k P(X_{n-1}=k) + (1+ (1-p)\mathbb{E}(X_n) )\sum_{k=n-1}^\infty P(X_{n-1}=k) \\ & = \mathbb{E}(X_{n-1}) + 1 + (1-p)\mathbb{E}(X_n) \\ \rightarrow \qquad \mathbb{E}(X_n) & = \frac{1}{p}\left[ 1 + \mathbb{E}(X_{n-1}) \right] \end{aligned} \] To solve this recursion we can use the fact that $\mathbb{E}(X_1) = \frac{1}{p}$, as $X_1$ is a geometric random variable. We therefore have: \[ \mathbb{E}(X_2) = \frac{1}{p}[ 1 + \mathbb{E}(X_{1})] = \frac{1}{p}\left( 1 + \frac{1}{p} \right) = \frac{1}{p} + \frac{1}{p^2} = \sum_{i=1}^2 \frac{1}{p^i} \] and then: \[ \mathbb{E}(X_3) = \frac{1}{p}( 1 + \mathbb{E}(X_2) ) = \frac{1}{p}\left( 1 + \sum_{i=1}^2 \frac{1}{p^i}\right) = \sum_{i=1}^3 \frac{1}{p^i} \] This pattern continues by induction, and the result $\mathbb{E}(X_n) = \sum_{i=1}^n \frac{1}{p^i}$ thus follows.
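As a sanity check (not part of the exercise), the closed form $\mathbb{E}(X_n) = \sum_{i=1}^n \frac{1}{p^i}$ can be compared against a Monte Carlo estimate. The sketch below simulates the games directly; the function name and parameter values are illustrative choices, not part of the problem statement:

```python
import random

def games_until_streak(n, p, rng):
    """Play games won independently with probability p; return the
    total number of games played when a streak of n wins first occurs."""
    games, streak = 0, 0
    while streak < n:
        games += 1
        streak = streak + 1 if rng.random() < p else 0
    return games

rng = random.Random(0)          # fixed seed for reproducibility
n, p, trials = 3, 0.5, 200_000  # example values: streak of 3, fair games
estimate = sum(games_until_streak(n, p, rng) for _ in range(trials)) / trials
exact = sum(1 / p**i for i in range(1, n + 1))  # 2 + 4 + 8 = 14 for p = 1/2
print(estimate, exact)
```

With $p = \frac{1}{2}$ and $n = 3$ the closed form gives $2 + 4 + 8 = 14$, and the simulated average should land close to this value.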
Problem 2
Problem 3
Problem 4
Problem 5
| Choice sequence | Total time | Probability |
| --- | --- | --- |
| 3 | 1 | $\frac{1}{3}$ |
| 1 3 | 2 + 1 = 3 | $\left(\frac{1}{3}\right)\left(\frac{1}{2}\right)= \frac{1}{6}$ |
| 2 3 | 3 + 1 = 4 | $\left(\frac{1}{3}\right)\left(\frac{1}{2}\right)=\frac{1}{6}$ |
| 1 2 3 | 2 + 3 + 1 = 6 | $\left(\frac{1}{3}\right)\left(\frac{1}{2}\right)(1)= \frac{1}{6}$ |
| 2 1 3 | 3 + 2 + 1 = 6 | $\left(\frac{1}{3}\right)\left(\frac{1}{2}\right)(1)= \frac{1}{6}$ |
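Given the enumeration above, the expected total time follows directly from the definition of expectation. This sketch assumes only the times and probabilities listed in the table, using exact rational arithmetic so no rounding is involved:

```python
from fractions import Fraction

# (total time, probability) pairs, one per choice sequence in the table
outcomes = [
    (1, Fraction(1, 3)),  # sequence: 3
    (3, Fraction(1, 6)),  # sequence: 1 3
    (4, Fraction(1, 6)),  # sequence: 2 3
    (6, Fraction(1, 6)),  # sequence: 1 2 3
    (6, Fraction(1, 6)),  # sequence: 2 1 3
]

# the five sequences are exhaustive, so the probabilities must sum to 1
assert sum(p for _, p in outcomes) == 1

expected_time = sum(t * p for t, p in outcomes)
print(expected_time)  # 7/2
```

That is, $\mathbb{E}(T) = 1 \cdot \frac{1}{3} + (3 + 4 + 6 + 6) \cdot \frac{1}{6} = \frac{21}{6} = \frac{7}{2}$.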