Conditional Divergence

Speaker: 

Liam Hardiman

Institution: 

UCI

Time: 

Tuesday, November 12, 2019 - 2:00pm to 3:00pm

Location: 

RH 510R

This week, we will continue to discuss Section 2.2 of the lecture notes of Wu and Polyanskiy:
http://people.lids.mit.edu/yp/homepage/papers.html
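For reference, the conditional divergence of the title is defined (in the standard form; the notes' notation may differ slightly) as an average of divergences between conditional distributions:

```latex
D\bigl(P_{Y|X} \,\|\, Q_{Y|X} \mid P_X\bigr)
  \;=\; \mathbb{E}_{X \sim P_X}\, D\bigl(P_{Y|X} \,\|\, Q_{Y|X}\bigr)
  \;=\; \sum_x P_X(x)\, D\bigl(P_{Y|X=x} \,\|\, Q_{Y|X=x}\bigr).
```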

Working Group in Information Theory is a self-educational project in the department. Techniques based on information theory have become essential in high-dimensional probability, theoretical computer science and statistical learning theory. On the other hand, information theory is not taught systematically. The goal of this group is to close this gap.

Conditional Divergence

Speaker: 

Liam Hardiman

Institution: 

UC Irvine

Time: 

Tuesday, November 5, 2019 - 2:00pm to 3:00pm

Location: 

RH 510R

This week, we will continue to discuss Sections 2.1–2.2 of the lecture notes of Wu and Polyanskiy:
http://people.lids.mit.edu/yp/homepage/papers.html


Pinsker’s Inequality and Differential Entropy

Speaker: 

John Peca-Medlin

Institution: 

UCI

Time: 

Tuesday, October 29, 2019 - 2:00pm

Location: 

RH 510R

This week, we will continue to discuss Sections 1.6 and 1.7 of the lecture notes of Wu and Polyanskiy:
http://people.lids.mit.edu/yp/homepage/papers.html
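For reference, Pinsker's inequality (one of the topics in the title) bounds total variation distance by KL divergence; with divergence in nats,

```latex
\mathrm{TV}(P, Q) \;=\; \sup_{A} \bigl|P(A) - Q(A)\bigr| \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)}.
```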


Information divergence and differential entropy

Speaker: 

John Peca-Medlin

Institution: 

UCI

Time: 

Tuesday, October 22, 2019 - 2:00pm to 3:00pm

Location: 

RH 510R

This week, we will discuss Sections 1.6 and 1.7 of the lecture notes of Wu and Polyanskiy:
http://people.lids.mit.edu/yp/homepage/papers.html
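For reference, the two notions in the title, stated here in the standard continuous form (the notes' conventions may differ slightly): for densities p and q, the information (KL) divergence and the differential entropy are

```latex
D(P \,\|\, Q) \;=\; \int p(x) \log \frac{p(x)}{q(x)}\, dx,
\qquad
h(X) \;=\; -\int p(x) \log p(x)\, dx .
```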


Submodularity of Entropy, Han's Inequality, and Shearer’s Lemma

Speaker: 

Kathryn Dover

Institution: 

UCI

Time: 

Tuesday, October 15, 2019 - 2:00pm to 3:00pm

Location: 

RH 510R


This week, we will discuss Sections 1.4 and 1.5 of the lecture notes of Wu and Polyanskiy:
http://people.lids.mit.edu/yp/homepage/papers.html
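For reference, the two named results can be stated as follows (standard forms; the notes' normalization may differ). Han's inequality bounds the joint entropy by the average of the leave-one-out entropies:

```latex
H(X_1, \dots, X_n) \;\le\; \frac{1}{n-1} \sum_{i=1}^{n} H\bigl(X_1, \dots, X_{i-1}, X_{i+1}, \dots, X_n\bigr).
```

Shearer's lemma generalizes this: if every index $i \in \{1,\dots,n\}$ belongs to at least $k$ of the subsets $S_1, \dots, S_m \subseteq \{1,\dots,n\}$, then

```latex
H(X_1, \dots, X_n) \;\le\; \frac{1}{k} \sum_{j=1}^{m} H\bigl(X_{S_j}\bigr).
```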


Welcome to the Working Group in Information Theory

Speaker: 

Roman Vershynin

Institution: 

UCI

Time: 

Tuesday, October 8, 2019 - 2:00pm to 3:00pm

Location: 

RH 510M

Working Group in Information Theory is a new self-educational project in the department. Techniques based on information theory have become essential in high-dimensional probability, theoretical computer science and statistical learning theory. On the other hand, information theory is not taught systematically. The goal of this group is to close this gap. We will meet weekly and take turns presenting material from the lecture notes of Wu and Polyanskiy:

http://people.lids.mit.edu/yp/homepage/papers.html

Everyone, including especially graduate students, is welcome to participate!

I will start by presenting Section 1.

Random tensors are well conditioned

Speaker: 

Roman Vershynin

Institution: 

UCI

Time: 

Thursday, February 28, 2019 - 12:00pm

Location: 

RH 340N

In contrast to random matrix theory, the theory of random tensors is poorly understood. This is surprising given the popularity of tensor methods in modern algorithmic applications. I will prove that tensor powers of independent random vectors are well conditioned with high probability. The argument uses several tools of high-dimensional probability as well as the Restricted Invertibility Principle of Bourgain and Tzafriri.
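The claim can be checked empirically on a small scale. The sketch below (an illustrative numerical experiment, not part of the proof; all parameter choices are ours) stacks vectorized tensor powers of independent Gaussian vectors as columns of a matrix and reports its condition number:

```python
# Illustrative experiment: tensor powers of independent random vectors
# tend to form a well-conditioned system of columns.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 8, 2, 20                   # dimension, tensor power, number of vectors (m <= n**d)

cols = []
for _ in range(m):
    x = rng.standard_normal(n)
    t = x
    for _ in range(d - 1):
        t = np.outer(t, x).ravel()   # t <- t (tensor) x, vectorized
    cols.append(t)
A = np.column_stack(cols)            # shape (n**d, m); columns are x_i^{(tensor d)}

s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
print(f"condition number s_max/s_min = {s[0] / s[-1]:.2f}")
```

A finite, moderate condition number for many random draws is consistent with the statement that such columns are well conditioned with high probability.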

An efficient net via random rounding

Speaker: 

Galyna Livshyts

Institution: 

Georgia Tech

Time: 

Thursday, March 7, 2019 - 12:00pm

Location: 

RH 340N

We prove the “net theorem” discussed in my previous talk in the Probability seminar. The “lite version” of the theorem states the existence of a net on the sphere, of cardinality 100^n, such that for every random matrix A with independent columns, with probability 1-5^{-n}, the values of |Ax| on the sphere can be compared to its values on the net. The error in this comparison is optimal, and the formulation of the theorem is scale-invariant. We emphasize that the only assumption required is the independence of the columns. The proof consists of four steps, which will be outlined in the talk.
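The random-rounding idea behind the construction can be sketched numerically. The toy code below (our illustration only; the grid step and all parameters are assumptions, not taken from the talk) rounds each coordinate of a sphere point to a grid, up or down at random so the rounding is unbiased, and compares |Ax| with |Ay| at the rounded point:

```python
# Toy sketch of random rounding: replace a sphere point x by a nearby
# random grid point y and compare |Ax| with |Ay|.
import numpy as np

rng = np.random.default_rng(1)
n = 50
h = 1.0 / np.sqrt(n)                 # grid step (an illustrative choice)

def random_round(x, step, rng):
    """Round each coordinate down or up to the grid step*Z, choosing 'up'
    with probability equal to the fractional part, so E[y] = x."""
    lo = np.floor(x / step) * step
    p = (x - lo) / step
    return lo + step * (rng.random(x.shape) < p)

A = rng.standard_normal((n, n))      # independent columns (Gaussian special case)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)               # a point on the sphere
y = random_round(x, h, rng)          # its random grid approximation

print(np.linalg.norm(A @ x), np.linalg.norm(A @ y))
```

Each coordinate of y differs from the corresponding coordinate of x by at most one grid step, which is what makes comparisons of |Ax| between the sphere and a coarse net plausible.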
