In this space I post the most common and recurrent questions of the
week. It can serve as additional material to deepen and test your
understanding.
Question
We know that $P(\emptyset)=0$ for any probability measure $P$. Can we
say that $E=\emptyset$ whenever $P(E)=0$?
While the answer is sometimes yes, for example in a finite sample
space where every outcome has positive probability (as in the case of
equiprobable outcomes), the answer is no in general. To see this,
consider flipping a fair coin repeatedly and let $\omega$ be the
outcome in which the coin lands on $H$ every time, i.e.
$\omega=(H,H,H,H,\dots)$. We show that $P(\{ \omega\})=0$, while, of
course, $\{ \omega \}\neq\emptyset$. Notice that, instead of
$\omega$, we could have chosen any specific sequence of heads and
tails. Define the events
$$
E_n=\big\{ (F_1,F_2,\dots)\in \{ H,T \}^\mathbb{N}\, \big |\,
F_1=F_2=\dots=F_n=H\big \}
$$
for $n\in \mathbb{N}$ and observe that $P(E_n)=\frac{1}{2^n}$. It then
follows that
$$
P(\{ \omega \})=P \bigl( \bigcap _{n=1}^\infty E_n\bigr)=
\lim _{n\to\infty}P(E_n)=\lim_{n\to\infty}\frac{1}{2^n}=0
$$
by continuity of the probability measure from above, which applies
since $E_{n+1}\subset E_n$ for $n\geq 1$.
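The decay $P(E_n)=\frac{1}{2^n}$ can also be checked numerically. Below is a small Monte Carlo sketch in Python (the trial count, seed, and cutoff $n\leq 6$ are my own choices, not part of the argument); note that the estimates are non-increasing in $n$ by construction, mirroring $E_{n+1}\subset E_n$.

```python
import random

random.seed(0)  # fixed seed so the experiment is reproducible

TRIALS = 100_000
MAX_N = 6

# counts[n-1] counts trials whose first n flips are all heads;
# dividing by TRIALS gives a Monte Carlo estimate of P(E_n).
counts = [0] * MAX_N
for _ in range(TRIALS):
    for n in range(1, MAX_N + 1):
        if random.random() < 0.5:  # this flip is heads
            counts[n - 1] += 1
        else:
            break  # a tail ends the initial run of heads

estimates = [c / TRIALS for c in counts]
for n, est in enumerate(estimates, start=1):
    print(f"P(E_{n}) ~ {est:.4f}  (exact: {0.5 ** n:.4f})")
```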
Question
Why does it hold that $\frac{1}{k}\sum_{i=0}^k\bigl(\frac{i}{k}\bigr)^m\simeq
\frac{1}{m+1}$?
The integral of a continuous function $f:[0,1]\to \mathbb{R}$ can be
computed as a limit of Riemann sums. In particular, if you partition the
interval $[0,1]$ into subintervals
$$
J_i=\bigl[\frac{i-1}{k},\frac{i}{k}\bigr),\: i=1,\dots,k
$$
and choose $x_i\in J_i$, then
$$
\int _0^1 f(x)\ dx=\lim _{k\to\infty} \sum _{i=1}^k f(x_i)\frac{1}{k},
$$
where, of course, $ \frac{1}{k}=\frac{i}{k}-\frac{i-1}{k}$ is the
length of $J_i$. Now simply choose $f(x)=x^m$ to see that
$$
\frac{1}{m+1}=\frac{1}{m+1}x^{m+1}\Big |_0^1=\int _0^1 x^m\ dx=
\lim _{k\to\infty} \sum _{i=1}^k x_i^m\frac{1}{k}=
\lim _{k\to\infty} \sum _{i=1}^k \bigl( \frac{i}{k}\bigr)
^m\frac{1}{k}=
\lim _{k\to\infty} \frac{1}{k}\sum _{i=0}^k \bigl( \frac{i}{k}\bigr) ^m,
$$
where we chose $x_i=\frac{i}{k}\in J_i$ and, in the last step, added
the vanishing $i=0$ term, which changes nothing since
$\bigl(\frac{0}{k}\bigr)^m=0$ for $m\geq 1$.
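The approximation is easy to verify numerically. A short Python sketch (the values of $m$ and $k$ below are arbitrary choices for illustration):

```python
def riemann_avg(m: int, k: int) -> float:
    """Compute (1/k) * sum_{i=0}^{k} (i/k)^m, the right-endpoint
    Riemann sum of x^m on [0, 1] (plus a vanishing i = 0 term)."""
    return sum((i / k) ** m for i in range(k + 1)) / k

# For large k the sum should be close to 1/(m+1).
for m in (1, 2, 5):
    exact = 1 / (m + 1)
    approx = riemann_avg(m, k=1000)
    print(f"m={m}: sum={approx:.5f}, 1/(m+1)={exact:.5f}")
```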
Question
In an infinite sequence of experiments with success probability $p\in (0,1)$ and failure
probability $q=1-p$, what is the probability of observing a run of $n$
consecutive successes before a run of $m$ consecutive failures?
We will use two decompositions of the sample space $\Omega$ by the
events $F_j$ and $S_j$ that the first failure, respectively the first
success, occurs in the $j$th experiment
$$
\Omega=\overset{.}{\bigcup}_{j=1}^\infty F_j,\:
\Omega=\overset{.}{\bigcup}_{j=1}^\infty S_j ,
$$
where the dot indicates that the sets are pairwise disjoint. Denoting
by $E$ the event of interest, we see that
\begin{align*}
P(E)&=P \Bigl( E\cap \overset{.}{\bigcup}_{j=1}^\infty F_j\Bigr) =
P \Bigl( \overset{.}{\bigcup}_{j=1}^\infty (E\cap F_j)\Bigr) =
\sum _{j=1}^\infty P(E\cap F_j)=\sum _{j=1}^\infty P(E|F_j)P(F_j)\\
P(E)&=P \Bigl( E\cap \overset{.}{\bigcup}_{j=1}^\infty S_j\Bigr) =
P \Bigl( \overset{.}{\bigcup}_{j=1}^\infty (E\cap S_j)\Bigr) =
\sum _{j=1}^\infty P(E\cap S_j)=\sum _{j=1}^\infty P(E|S_j)P(S_j).
\end{align*}
Observe now that $P(E|F_j)=1$ for $j\geq n+1$: if the first failure
occurs in experiment $n+1$ or later, the first $n$ experiments were
all successes, so a run of $n$ successes occurs before any failure at
all. Similarly, $P(E|S_j)=0$ for $j\geq m+1$, as this means that the
first $j-1\geq m$ experiments all yielded failures. It follows that
\begin{align*}
P(E)&=\sum _{j=1}^n P(E|F_j)P(F_j)+\sum_{j=n+1}^\infty P(F_j)\\
P(E)&=\sum _{j=1}^m P(E|S_j)P(S_j).
\end{align*}
By independence of the experiments, it holds that
$P(F_j)=p^{j-1}(1-p)$ and that $P(S_j)=(1-p)^{j-1}p=q^{j-1}p$.
Moreover, denoting by $S$ the event that the first experiment is a
success, it holds that $P(E|F_j)=P(E|S^\mathsf{c})$ for $j\leq n$ and
that $P(E|S_j)=P(E|S)$ for $j\leq m$: again by independence,
conditioning on the first failure occurring in experiment $j\leq n$
amounts to restarting the sequence with a single failure, and
conditioning on the first success occurring in experiment $j\leq m$
amounts to restarting with a single success. Summarizing
\begin{align*}
P(E)&=\sum _{j=1}^nP(E|S^\mathsf{c})p^{j-1}(1-p)+
\sum_{j=n+1}^\infty p^{j-1}(1-p)=
(1-p)\frac{1-p^n}{1-p}P(E|S^\mathsf{c})+p^n=
(1-p^n)P(E|S^\mathsf{c})+p^n\\
P(E)&=\sum _{j=1}^m P(E|S)q^{j-1}(1-q)=(1-q^m)P(E|S).
\end{align*}
Finally, using the law of total probability together with $P(S)=p$
and $P(S^\mathsf{c})=q$, we obtain
$$
P(E)=P(E|S)P(S)+P(E|S^\mathsf{c})
P(S^\mathsf{c})=p\frac{P(E)}{1-q^m}+q\frac{P(E)-p^n}{1-p^n}
$$
and solving for $P(E)$ we arrive at
$$
P(E)=\frac{(1-q^m)p^{n-1}}{p^{n-1}+q^{m-1}-p^{n-1}q^{m-1}}.
$$
Question
Flip a coin three times. Assume the coin is biased, with probability
$P(H)=0.3$ of landing heads, and determine the expected number of
heads observed.
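By linearity of expectation the answer is $3\cdot 0.3=0.9$. As a check, a brute-force enumeration over the $2^3$ equally structured outcomes, weighting each by its probability, gives the same value:

```python
from itertools import product

p = 0.3  # probability of heads on a single flip

expected = 0.0
for flips in product("HT", repeat=3):  # all 2**3 outcomes
    prob = 1.0
    for f in flips:
        prob *= p if f == "H" else 1 - p  # independent flips
    expected += prob * flips.count("H")

print(expected)  # agrees with 3 * p by linearity of expectation
```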