Some Questions about Groups Coming from Information Theory

Speaker: Michael O'Sullivan
Institution: San Diego State University
Time: Mon, 04/24/2017 - 3:00pm - 4:00pm
Host: Nathan Kaplan
Location: RH 440R

For each random n-vector there is an entropy vector of length 2^n - 1, with one entry for each nonempty subset of the coordinates. A fundamental question in information theory is to characterize the region formed by these entropic vectors. The region is bounded by Shannon's inequalities, but not tightly bounded for n > 3. Chan and Yeung discovered that random vectors constructed from groups fill out the entropic region, so that information-theoretic properties can be interpreted as properties of groups, and combinatorial results about groups can be used to better understand the entropic region. I will elaborate on these connections and present some simple and interesting questions about groups that arise.
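
To make the connection concrete, here is a brief sketch of the kind of group construction referred to above, in notation of my own choosing rather than the speaker's: from a finite group G with subgroups G_1, ..., G_n, one builds random variables X_i = gG_i, the coset of G_i containing a uniformly random element g of G. The resulting entropy vector has the closed form

\[
  H(X_A) \;=\; \log \frac{|G|}{\bigl|\bigcap_{i \in A} G_i\bigr|}
  \qquad \text{for each nonempty } A \subseteq \{1, \dots, n\},
\]

and Chan and Yeung's result is that vectors of this form, after taking convex combinations and limits, account for the closure of the entropic region.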