## Speaker:

## Institution:

## Time:

## Host:

## Location:

Consider an unknown random vector $X$ that takes values in $\mathbb{R}^d$. Is it possible to "guess" its mean accurately if the only information available consists of $N$ independent copies of $X$? More precisely, given an arbitrary norm on $\mathbb{R}^d$, the goal is to find a mean estimation procedure that, on receiving a confidence parameter $\delta$ and $N$ independent copies $X_1,\dots,X_N$ of an unknown random vector $X$ (with finite mean $\mu$ and finite covariance), returns an estimate $\hat{\mu}$ for which the norm of the error $\hat{\mu} - \mu$ is as small as possible with probability at least $1-\delta$.

This mean estimation problem has been studied extensively over the years, and I will present some of the ideas that have led to its solution. An obvious choice is to set $\hat{\mu}$ to be the empirical mean, i.e., the arithmetic mean of the sample vectors $X_i$. Surprisingly, the empirical mean is a terrible option for small confidence parameters $\delta$ -- most notably when $X$ is "heavy-tailed". What is even more surprising is that one can find an optimal estimation procedure that performs as if the (arbitrary) random vector $X$ were Gaussian. (Joint work with G. Lugosi.)
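The abstract does not name the optimal procedure, but a standard robust alternative to the empirical mean in this line of work is the median-of-means estimator: split the sample into blocks, average within each block, and take the median of the block means. The sketch below (my own illustration, not the speaker's construction; the truly optimal multivariate estimator is more involved) compares the empirical mean with a coordinatewise median-of-means on heavy-tailed Student-$t$ data, where the true mean is $0$:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_mean(X):
    # Plain arithmetic mean of the sample vectors.
    return X.mean(axis=0)

def median_of_means(X, k):
    # Split the N samples into k disjoint blocks, average within each
    # block, then take the coordinatewise median of the k block means.
    blocks = np.array_split(X, k)
    block_means = np.stack([b.mean(axis=0) for b in blocks])
    return np.median(block_means, axis=0)

# Heavy-tailed data: Student-t with 3 degrees of freedom has finite
# variance but heavy tails. True mean is the zero vector; d = 5.
N, d = 2000, 5
X = rng.standard_t(df=3, size=(N, d))

err_emp = np.linalg.norm(empirical_mean(X))       # Euclidean error
err_mom = np.linalg.norm(median_of_means(X, k=20))
print("empirical mean error:", err_emp)
print("median-of-means error:", err_mom)
```

On a typical run both errors are small, but over repeated draws the median-of-means error concentrates much better in the tails, which is exactly the regime (small $\delta$) where the empirical mean fails.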