Common Continuous Distributions

Lecture 22

In this video the important representation formula $$ E(Y)=\int _0^\infty P(Y>y)\, dy-\int _0^\infty P(Y<-y)\, dy $$ is derived, connecting the expected value of a continuous random variable $Y$ to its cumulative distribution function. Using it, the proof of $$ E \bigl( g(X)\bigr)=\int _{-\infty} ^\infty g(x)\, f_X(x)\, dx $$ is also presented for a continuous random variable $X$ with density $f_X$ and a (continuous) function $g:\mathbb{R}\to \mathbb{R}$. Do you remember how the corresponding formula works for a discrete random variable?
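As a sanity check (not part of the lecture), both formulas can be verified numerically for a concrete choice of random variable. The sketch below takes $Y\sim\text{Uniform}(-1,2)$, whose mean is $1/2$ and whose tail probabilities are known in closed form; the example distribution and the midpoint-rule integrator are illustrative assumptions, not the lecture's method.

```python
def integrate(f, a, b, n=100000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# Tail probabilities of Y ~ Uniform(-1, 2), clipped to [0, 1]:
def p_greater(y):          # P(Y > y)
    return min(max((2 - y) / 3, 0.0), 1.0)

def p_less(y):             # P(Y < y)
    return min(max((y + 1) / 3, 0.0), 1.0)

# Tail formula: E(Y) = int_0^inf P(Y > y) dy - int_0^inf P(Y < -y) dy.
# Both tails vanish outside [-1, 2], so finite limits suffice.
mean = integrate(p_greater, 0, 2) - integrate(lambda y: p_less(-y), 0, 1)
print(mean)                # close to 0.5, the mean of Uniform(-1, 2)

# E(g(Y)) with g(x) = x^2 and density f_Y = 1/3 on [-1, 2]:
second_moment = integrate(lambda x: x**2 / 3, -1, 2)
print(second_moment)       # close to 1.0
```

The clipping in `p_greater` and `p_less` keeps the tail probabilities valid outside the support, which is what lets the two integrals terminate at the endpoints of the interval.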

Lecture 23

The uniform and the normal distribution are introduced. The latter is ubiquitous in probability and its applications across the sciences. It is characterized by its probability density function $$ f(x)=\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2 \sigma^2}}, $$ where $\mu$ is its mean and $\sigma ^2$ its variance; a random variable with this density is said to be $N(\mu,\sigma^2)$-distributed. The video ends with a couple of examples.
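A small illustration (not from the video): transcribing the density into code and integrating it numerically recovers the familiar fact that about 68% of the probability mass lies within one standard deviation of the mean. The values $\mu=1$, $\sigma=2$ below are arbitrary, chosen only to show that the rule does not depend on them.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2), a direct transcription of the formula above."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=100000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# P(mu - sigma < X < mu + sigma) is about 0.6827 for every mu and sigma.
mu, sigma = 1.0, 2.0
p = integrate(lambda x: normal_pdf(x, mu, sigma), mu - sigma, mu + sigma)
print(round(p, 4))   # about 0.6827
```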

Lecture 24

The De Moivre-Laplace theorem connects the binomial distribution to the normal distribution through a limiting process. It is a special case of the very important central limit theorem, in which similar convergence to a normal distribution holds for much more general sums of independent random variables. In the video a binomially distributed random variable $S_n\sim B(n,p)$ is used; it can be written as the sum of $n$ independent Bernoulli random variables $X_k\sim B(1,p)$, $k=1,\dots, n$. The exponential distribution $\operatorname{Exp}(\lambda)$ is introduced next. Its properties are studied, in particular its memoryless nature, and examples are given. It is characterized by the probability density function $$ f(x)=\begin{cases} 0,&x<0,\\ \lambda e^{-\lambda x},&x\geq 0.\end{cases} $$
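Both results above can be checked numerically. The sketch below, under assumed illustrative parameters ($n=1000$, $p=0.3$, the interval $[280,320]$, and $\lambda=0.5$), compares an exact binomial probability with its continuity-corrected normal approximation, and then verifies the memorylessness identity $P(T>s+t \mid T>s)=P(T>t)$ using the survival function $e^{-\lambda x}$ of the exponential distribution.

```python
import math

def binom_pmf(n, k, p):
    """P(S_n = k) for S_n ~ B(n, p)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_cdf(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p = 1000, 0.3
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

# Exact P(280 <= S_n <= 320) versus the De Moivre-Laplace approximation
# (with the usual continuity correction of 1/2 at each endpoint):
exact = sum(binom_pmf(n, k, p) for k in range(280, 321))
approx = normal_cdf((320.5 - mu) / sigma) - normal_cdf((279.5 - mu) / sigma)
print(exact, approx)   # the two values agree closely

# Memorylessness of Exp(lambda): P(T > s + t | T > s) = P(T > t).
lam, s, t = 0.5, 2.0, 3.0
surv = lambda x: math.exp(-lam * x)   # survival function P(T > x)
print(surv(s + t) / surv(s), surv(t))
```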