Conditional Probability.

An important, if not the most important, concept in probability is that of conditional probability. It allows one to "reduce the randomness" based on available knowledge. A very simple but illustrative example is the following: the probability of rolling a 5 on a fair die is $\frac{1}{6}$, but if we knew that an odd number had been rolled, the probability would be reassessed to $\frac{1}{3}$. The second probability is conditioned on the knowledge that an odd number has been rolled, which is itself an event in the sample space. Examples of this type are the intuitive basis for the definition of the conditional probability $P(E|F)$ of an event $E$ given an event $F$: $$ P(E|F)=\frac{P(E\cap F)}{P(F)}, $$ provided $P(F)>0$.
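As a sanity check, the die example can be reproduced by enumerating the sample space. The Python sketch below (the names `S`, `E`, `F`, and the helper `prob` are ad hoc choices for illustration, not from the lectures) computes both probabilities exactly.

```python
from fractions import Fraction

# Sample space for one roll of a fair die; every outcome has probability 1/6.
S = {1, 2, 3, 4, 5, 6}
E = {5}        # event "a 5 is rolled"
F = {1, 3, 5}  # event "an odd number is rolled"

def prob(A):
    """Probability of an event A under the uniform measure on S."""
    return Fraction(len(A & S), len(S))

# Definition: P(E|F) = P(E ∩ F) / P(F), provided P(F) > 0.
print(prob(E))                # 1/6, the unconditional probability
print(prob(E & F) / prob(F))  # 1/3, the probability given an odd roll
```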

Lecture 7

The concept of conditional probability is introduced and a number of examples are given to flesh out the definition. It is also shown that $$ P(E_1\cap E_2\cap\cdots\cap E_n)=P(E_1)\, P(E_2|E_1)\, P(E_3|E_1\cap E_2)\,\cdots\, P(E_n|E_1\cap\cdots\cap E_{n-1}) $$ for any finite number of events $E_1,\dots,E_n$ with $P(E_1\cap\cdots\cap E_{n-1})>0$, so that every conditional probability on the right-hand side is defined.
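For a concrete instance of this multiplication rule (a hypothetical illustration, not an example taken from the lecture), consider drawing three cards from a standard 52-card deck without replacement, with $E_k$ the event that the $k$-th card is an ace. The sketch below checks that the chain of conditional probabilities agrees with the direct count.

```python
from fractions import Fraction

# E_k = "the k-th card drawn is an ace", drawing without replacement.
# Multiplication rule: P(E1 ∩ E2 ∩ E3) = P(E1) P(E2|E1) P(E3|E1 ∩ E2).
chain = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

# Direct count: ordered ways to draw 3 of the 4 aces over all ordered 3-card draws.
direct = Fraction(4 * 3 * 2, 52 * 51 * 50)

assert chain == direct
print(chain)  # 1/5525
```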

Lecture 8

More illustrative examples are discussed and the validity of the important Bayes' Formula $$ P(E)=P(E|F)P(F)+P(E|F^\mathsf{c})P(F^\mathsf{c}) $$ is proved for any choice of events $E,F\subseteq S$ with $P(F)>0$ and $P(F^\mathsf{c})>0$. In fact, the formula remains valid if either of $P(F)$, $P(F^\mathsf{c})$ vanishes, with the understanding that the corresponding term is omitted from the sum.
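A minimal numerical sketch of the formula, using a made-up two-urn setup (the urn compositions below are assumptions for illustration only): an urn is chosen at random and one ball is drawn, with $F$ the event that urn 1 is chosen and $E$ the event that the drawn ball is white.

```python
from fractions import Fraction

# Hypothetical setup: urn 1 has 3 white balls out of 5, urn 2 has 1 out of 4.
P_F  = Fraction(1, 2)          # P(F): urn 1 chosen
P_Fc = 1 - P_F                 # P(F^c): urn 2 chosen
P_E_given_F  = Fraction(3, 5)  # P(E|F)
P_E_given_Fc = Fraction(1, 4)  # P(E|F^c)

# P(E) = P(E|F) P(F) + P(E|F^c) P(F^c)
P_E = P_E_given_F * P_F + P_E_given_Fc * P_Fc
print(P_E)  # 17/40
```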

Lecture 9

More examples are presented, intertwined with the concept of the odds of an event $E$, given by $$ \frac{P(E)}{P(E^\mathsf{c})}, $$ and with the generalized Bayes' Formula $$ P(E)=P(E|F_1)P(F_1)+P(E|F_2)P(F_2)+\dots+P(E|F_n)P(F_n) $$ for pairwise disjoint events $F_1,F_2,\dots,F_n$ with $S=F_1\cup F_2\cup \cdots\cup F_n$ (terms with $P(F_i)=0$ again being omitted). A consequence of Bayes' Formula is also considered.
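The following sketch ties the two notions together on a made-up three-way partition $S=F_1\cup F_2\cup F_3$ (all numbers below are assumed for illustration): it evaluates the generalized formula and then the odds of the resulting event.

```python
from fractions import Fraction

# Hypothetical partition probabilities P(F_i) (they sum to 1) and
# conditional probabilities P(E|F_i).
P_F = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
P_E_given_F = [Fraction(1, 10), Fraction(1, 5), Fraction(1, 2)]

# Generalized formula: P(E) = sum_i P(E|F_i) P(F_i)
P_E = sum(pe * pf for pe, pf in zip(P_E_given_F, P_F))
print(P_E)              # 1/5
print(P_E / (1 - P_E))  # odds of E: P(E)/P(E^c) = 1/4
```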