Variance and Common Distributions.

The discussion of (discrete) random variables is concluded with the definition of their variance (and of higher moments). We then start looking at specific distributions, that is, random variables characterized by specific probability mass functions.

Lecture 16

It is shown that, given a discrete random variable $X$ with $X(S)=\{x_k\}_{k\in\mathbb{N}}$ and a function $g:X(S)\to\mathbb{R}$, it is possible to compute the expected value of the random variable $g\circ X$ by $$ E(g\circ X)=E \bigl( g(X)\bigr) =\sum _{k\in \mathbb{N}}g(x_k)\, p_X(x_k), $$ where you recognize the formula for the expectation of $X$ if you choose $g$ to be the identity. The concepts of variance and standard deviation are defined. The variance captures the expected squared deviation of a random variable from its mean (its expectation), and the standard deviation, its square root, measures that spread in the same units as the variable itself.
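As a concrete illustration (not part of the lecture), here is a minimal Python sketch that evaluates $E(g(X))$ directly from a finite probability mass function, with a fair six-sided die as the assumed example; choosing $g$ as the identity recovers the mean, and choosing $g(x)=(x-E(X))^2$ yields the variance.

```python
import math

# pmf of a fair six-sided die: values x_k mapped to probabilities p_X(x_k)
pmf = {x: 1 / 6 for x in range(1, 7)}

def expectation(pmf, g=lambda x: x):
    """E(g(X)) = sum_k g(x_k) * p_X(x_k); g defaults to the identity."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expectation(pmf)                           # E(X) = 3.5
var = expectation(pmf, lambda x: (x - mean)**2)   # Var(X) = E((X - E(X))^2)
std = math.sqrt(var)                              # standard deviation

print(mean, var, std)  # 3.5  2.9166...  1.7078...
```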

Lecture 17

We make the acquaintance of the Bernoulli and binomial distributions, look at examples, and learn some of their properties. The binomial distribution typically captures the number of successes in a series of $n\in \mathbb{N}$ independent Bernoulli experiments. The latter are simple experiments with exactly two outcomes, success or failure, occurring with probabilities $p\in[0,1]$ and $1-p$, respectively.
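The following Python sketch (an illustration under assumed parameters $n=10$, $p=0.3$, not taken from the lecture) builds the binomial probability mass function $p_X(k)=\binom{n}{k}p^k(1-p)^{n-k}$ from its definition and verifies the standard identities $E(X)=np$ and $\operatorname{Var}(X)=np(1-p)$ numerically.

```python
from math import comb, isclose

def binomial_pmf(n, p):
    """p_X(k) = C(n, k) * p^k * (1-p)^(n-k): probability of exactly k
    successes in n independent Bernoulli(p) experiments."""
    return {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

n, p = 10, 0.3          # assumed example parameters
pmf = binomial_pmf(n, p)

mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())

assert isclose(sum(pmf.values()), 1.0)   # the pmf sums to 1
assert isclose(mean, n * p)              # E(X) = np
assert isclose(var, n * p * (1 - p))     # Var(X) = np(1-p)
```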

Lecture 18

More examples and more properties of the binomial distribution are considered. The lecture concludes with the very useful Stirling's (Approximation) Formula $$ n!\approx\sqrt{2\pi}\, n^{n+1/2}e^{-n}\quad\text{for $n$ large,} $$ which gives an approximation of the factorial function.
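As a quick numerical check (a sketch, not part of the lecture), the Python snippet below compares $n!$ with Stirling's formula; the relative error behaves like $1/(12n)$, so the approximation improves as $n$ grows.

```python
from math import factorial, sqrt, pi, e

def stirling(n):
    """Stirling's approximation: sqrt(2*pi) * n^(n + 1/2) * e^(-n)."""
    return sqrt(2 * pi) * n**(n + 0.5) * e**(-n)

for n in (1, 5, 10, 20):
    exact, approx = factorial(n), stirling(n)
    # relative error shrinks roughly like 1/(12n)
    print(n, exact, approx, (exact - approx) / exact)
```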