Speaker: Professor Max Welling

Institution: UC Irvine

Time: Friday, October 16, 2009 - 2:00pm

Location: RH 440R

Jaynes's maximum entropy principle is a widely used method for learning probabilistic models of data. Learning the parameters of such models is computationally intractable for most problems of interest in machine learning, so one has to resort to severe approximations. However, by "appropriately tweaking" the standard learning rules, one can define a nonlinear dynamical system without fixed points or even periodic orbits. This system is related to a family of weakly chaotic systems known as "piecewise isometries", which have vanishing topological entropy. The symbolic sequences of the very simplest one-dimensional system are equivalent to Sturmian sequences. The averages over the symbolic sequences of many coupled variables can be shown to capture the relevant correlations present in the data. In this sense, we use this system to learn from data and make new predictions.
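
A minimal sketch of the kind of "tweaked" learning dynamics the abstract describes, assuming a small, enumerable state space and a user-supplied feature function; the names feature_fn, states, and data_moments are illustrative, not the speaker's notation, and this is only one plausible instance of such a fixed-point-free update, not necessarily the exact formulation presented in the talk.

import numpy as np

def dynamics_step(w, feature_fn, states, data_moments):
    # Pick the state that maximizes the current score w . phi(s).
    s_star = max(states, key=lambda s: w @ feature_fn(s))
    # Push the weights toward the data statistics and away from the
    # chosen state's features; with no step size and no averaging,
    # the weight trajectory never settles into a fixed point.
    w_new = w + data_moments - feature_fn(s_star)
    return w_new, s_star

if __name__ == "__main__":
    # Toy example (hypothetical): two binary variables with a pairwise feature.
    rng = np.random.default_rng(0)
    states = [np.array(s, dtype=float) for s in [(0, 0), (0, 1), (1, 0), (1, 1)]]
    feature_fn = lambda s: np.array([s[0], s[1], s[0] * s[1]])
    # Stand-in for feature averages measured on a data set.
    data_moments = np.array([0.6, 0.7, 0.5])

    w = rng.normal(size=3)
    samples = []
    for _ in range(5000):
        w, s = dynamics_step(w, feature_fn, states, data_moments)
        samples.append(feature_fn(s))
    # Averages over the generated sequence of states approach the data moments.
    print(np.mean(samples, axis=0))

Run as a script, the printed averages come out close to data_moments, illustrating the claim that averages over the symbolic sequence capture the correlations present in the data.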