In this space I post the most common and recurring questions of
the week. It can serve as additional material to deepen and test
your understanding.
Question
In an infinite sequence of experiments with success probability $p\in (0,1)$ and failure
probability $q=1-p$, what is the probability of observing a run of $n$
consecutive successes before a run of $m$ consecutive failures?
We will use two decompositions of the sample space $\Omega$, by the
events $F_j$ that the first failure occurs in the $j$th experiment and
the events $S_j$ that the first success occurs in the $j$th experiment:
$$
\Omega=\overset{.}{\bigcup}_{j=1}^\infty F_j,\qquad
\Omega=\overset{.}{\bigcup}_{j=1}^\infty S_j ,
$$
where the dot indicates that the sets are pairwise disjoint. Denoting
by $E$ the set of interest, we see that
\begin{align*}
P(E)&=P \Bigl( E\cap \overset{.}{\bigcup}_{j=1}^\infty F_j\Bigr) =
P \Bigl( \overset{.}{\bigcup}_{j=1}^\infty (E\cap F_j)\Bigr) =
\sum _{j=1}^\infty P(E\cap F_j)=\sum _{j=1}^\infty P(E|F_j)P(F_j)\\
P(E)&=P \Bigl( E\cap \overset{.}{\bigcup}_{j=1}^\infty S_j\Bigr) =
P \Bigl( \overset{.}{\bigcup}_{j=1}^\infty (E\cap S_j)\Bigr) =
\sum _{j=1}^\infty P(E\cap S_j)=\sum _{j=1}^\infty P(E|S_j)P(S_j).
\end{align*}
Observe now that $P(E|F_j)=1$ for $j\geq n+1$, since if the first
failure occurs in experiment $j\geq n+1$, the first $n$ experiments
were all successes, so a run of $n$ successes has already occurred
before any failure. Similarly, $P(E|S_j)=0$ for $j\geq m+1$, since
then the first $j-1\geq m$ experiments all yielded failures, so a run
of $m$ failures occurred first. It follows that
\begin{align*}
P(E)&=\sum _{j=1}^n P(E|F_j)P(F_j)+\sum_{j=n+1}^\infty P(F_j)\\
P(E)&=\sum _{j=1}^m P(E|S_j)P(S_j).
\end{align*}
It holds that $P(F_j)=p^{j-1}(1-p)$ and that
$P(S_j)=(1-p)^{j-1}p=q^{j-1}p$. Moreover, denoting by $S$ the event
that the first experiment is a success, it holds that
$P(E|F_j)=P(E|S^\mathsf{c})$ for $j\leq n$ and that
$P(E|S_j)=P(E|S)$ for $j\leq m$: by independence of the experiments,
the occurrence of the first failure amounts to restarting the
sequence with a failure, and the occurrence of the first success
amounts to restarting it with a success. Summarizing,
\begin{align*}
P(E)&=\sum _{j=1}^nP(E|S^\mathsf{c})p^{j-1}(1-p)+
\sum_{j=n+1}^\infty p^{j-1}(1-p)=
(1-p)\frac{1-p^n}{1-p}P(E|S^\mathsf{c})+p^n=
(1-p^n)P(E|S^\mathsf{c})+p^n\\
P(E)&=\sum _{j=1}^m P(E|S)q^{j-1}(1-q)=(1-q^m)P(E|S).
\end{align*}
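As a quick numerical sanity check (an illustrative sketch, with $p$ chosen arbitrarily), the first-failure and first-success probabilities above are geometric and each family sums to $1$, as the two decompositions of the sample space require:

```python
p = 0.3
q = 1 - p

# P(F_j) = p^(j-1) * q : first failure in experiment j (j-1 leading successes)
# P(S_j) = q^(j-1) * p : first success in experiment j (j-1 leading failures)
total_F = sum(p**(j - 1) * q for j in range(1, 200))
total_S = sum(q**(j - 1) * p for j in range(1, 200))

print(total_F, total_S)  # both partial sums approach 1
```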
Finally, using the law of total probability, we obtain
$$
P(E)=P(E|S)P(S)+P(E|S^\mathsf{c})
P(S^\mathsf{c})=p\frac{P(E)}{1-q^m}+q\frac{P(E)-p^n}{1-p^n}
$$
and solving this linear equation for $P(E)$ (multiply through by
$(1-p^n)(1-q^m)$, collect the $P(E)$ terms, and divide numerator and
denominator by $pq$) we arrive at
$$
P(E)=\frac{(1-q^m)p^{n-1}}{p^{n-1}+q^{m-1}-p^{n-1}q^{m-1}}.
$$
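The closed-form expression can be checked against a Monte Carlo simulation. The following sketch (parameter values $p=0.6$, $n=3$, $m=2$ are arbitrary choices for illustration) simulates the Bernoulli experiments directly and compares the empirical frequency with the formula:

```python
import random

def run_formula(p, n, m):
    """Closed-form P(run of n successes before a run of m failures)."""
    q = 1 - p
    return (1 - q**m) * p**(n - 1) / (p**(n - 1) + q**(m - 1) - p**(n - 1) * q**(m - 1))

def run_simulation(p, n, m, trials=200_000, seed=0):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        successes = failures = 0
        while True:
            if rng.random() < p:  # success: extend success run, reset failure run
                successes += 1
                failures = 0
            else:                 # failure: extend failure run, reset success run
                failures += 1
                successes = 0
            if successes == n:    # run of n successes observed first
                hits += 1
                break
            if failures == m:     # run of m failures observed first
                break
    return hits / trials

print(run_formula(0.6, 3, 2))     # exact value from the derivation
print(run_simulation(0.6, 3, 2))  # should agree to about two decimal places
```

Note the special case $n=1$: the formula reduces to $1-q^m$, which is simply the probability that the first $m$ experiments are not all failures.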
Question
Flip a coin three times. Assume the coin is biased, with probability
$P(H)=0.3$ of flipping heads, and determine the expected number of
heads observed.
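By linearity of expectation the answer is $3\cdot 0.3=0.9$; a brute-force enumeration of all $2^3$ outcomes (a small sketch) confirms this value:

```python
from itertools import product

p = 0.3  # P(H) as given in the question

# Enumerate all 2^3 outcomes of three flips; weight the head count of
# each outcome by its probability p^(#H) * (1-p)^(#T).
expected_heads = sum(
    outcome.count("H") * p**outcome.count("H") * (1 - p)**outcome.count("T")
    for outcome in ("".join(t) for t in product("HT", repeat=3))
)

print(expected_heads)  # linearity of expectation gives 3 * 0.3 = 0.9
```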