Past Seminars

  • Daqing Wan
    Thu May 17, 2018
    3:00 pm
    Given a global function field K of characteristic p>0, the fundamental arithmetic invariants include the genus, the class number, the p-rank and more generally the slope sequence of the zeta function of K. In this expository lecture, we explore possible stability of these invariants in a p-adic Lie tower of K. Strong stability is expected when...
  • Jeffrey Galkowski
    Thu May 17, 2018
    2:00 pm
    In this talk we relate concentration of Laplace eigenfunctions in position and momentum to sup-norms and submanifold averages. In particular, we present a unified picture for sup-norms and submanifold averages which characterizes the concentration of those eigenfunctions with maximal growth. We then exploit this characterization to derive...
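    For orientation, the classical benchmark that work on eigenfunction sup-norms typically refines is the Levitan–Avakumović–Hörmander estimate: for a Laplace eigenfunction $-\Delta_g u = \lambda^2 u$ on a compact Riemannian manifold $(M,g)$ of dimension $n$,

```latex
% Classical sup-norm bound for Laplace eigenfunctions on a compact manifold
\[
  \|u\|_{L^\infty(M)} \;\le\; C_M \,\lambda^{\frac{n-1}{2}}\, \|u\|_{L^2(M)}.
\]
```

    The bound is saturated, for example, by zonal spherical harmonics on the round sphere; eigenfunctions of this kind exhibit the "maximal growth" that the characterization in the talk addresses.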
  • Daniil Rudenko
    Wed May 16, 2018
    2:00 pm
    Classical polylogarithms have been studied extensively since the pioneering work of Euler and Abel. They are known to satisfy many functional equations, but beyond weight 4 such equations have not yet been found; even in weight 4 they were first discovered only through heavy computer-assisted computations. The main goal of the talk is to explain...
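    The prototype of such a functional equation, in weight 2, is Abel's five-term relation for the dilogarithm; written in terms of the Rogers form $L(x) = \mathrm{Li}_2(x) + \tfrac{1}{2}\log x \log(1-x)$ it reads

```latex
% Abel's five-term relation for the Rogers dilogarithm L, valid for 0 < x, y < 1
\[
  L(x) + L(y) \;=\; L(xy)
    + L\!\left(\frac{x(1-y)}{1-xy}\right)
    + L\!\left(\frac{y(1-x)}{1-xy}\right).
\]
```

    The higher-weight equations referred to in the abstract are analogues of this relation.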
  • Bao Wang
    Mon May 14, 2018
    4:00 pm
    First, I will present the Laplacian smoothing gradient descent method recently proposed by Prof. Stan Osher. We show that when applied to a variety of machine learning models, including softmax regression, convolutional neural nets, generative adversarial nets, and deep reinforcement learning, this very simple surrogate of gradient descent can...
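    A minimal sketch of one standard formulation of the method, in which each gradient is pre-multiplied by $(I + \sigma L)^{-1}$ with $L$ the one-dimensional periodic discrete Laplacian and the solve carried out by FFT; the function names and default parameters below are illustrative assumptions, not necessarily the exact variant presented in the talk.

```python
import numpy as np

def smooth_gradient(g, sigma=1.0):
    """Solve (I + sigma * L) v = g via FFT, where L is the 1-D periodic
    discrete Laplacian; sigma = 0 returns the raw gradient unchanged."""
    n = g.size
    # Fourier eigenvalues of the circulant Laplacian: 2 - 2*cos(2*pi*k/n) >= 0
    eig = 2.0 - 2.0 * np.cos(2.0 * np.pi * np.arange(n) / n)
    return np.real(np.fft.ifft(np.fft.fft(g) / (1.0 + sigma * eig)))

def lsgd(grad_f, w0, lr=0.1, sigma=1.0, steps=200):
    """Gradient descent in which each gradient is replaced by its smoothed surrogate."""
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(steps):
        g = np.asarray(grad_f(w), dtype=float)
        w = w - lr * smooth_gradient(g.ravel(), sigma).reshape(w.shape)
    return w
```

    For instance, lsgd(lambda w: 2 * w, np.ones(50)) runs the scheme on a simple quadratic; setting sigma=0 recovers plain gradient descent.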
  • Isaac Goldbring
    Mon May 14, 2018
    4:00 pm
    In this first of two talks, I will explain the notion of definability in continuous logic and connect it with the notion of spectral gap in the theory of unitary representations and in ergodic theory.
  • Michael Porter
    Mon May 14, 2018
    2:30 pm
  • Zhiqin Lu
    Fri May 11, 2018
    4:00 pm
    In this talk, we shall give a mathematical setting of the Random Backpropagation (RBP) method in unsupervised machine learning. When there is no hidden layer in the neural network, the method degenerates to the usual least squares method. When there are multiple hidden layers, we can formulate the learning procedure as a system of nonlinear...
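    A minimal sketch of the random-backpropagation idea for a single hidden layer, assuming the formulation in which the backward pass routes the output error through a fixed random matrix B rather than the transpose of the forward weights; the architecture, tanh activation, and least-squares loss below are illustrative assumptions rather than the talk's exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbp_train(X, Y, hidden=32, lr=1e-2, epochs=200):
    """One-hidden-layer network trained with random backpropagation (RBP):
    the backward pass uses a fixed random matrix B instead of W2.T."""
    d, m = X.shape[1], Y.shape[1]
    W1 = rng.normal(scale=0.1, size=(d, hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, m))
    B = rng.normal(scale=0.1, size=(m, hidden))   # fixed random feedback weights
    for _ in range(epochs):
        H = np.tanh(X @ W1)          # hidden activations
        P = H @ W2                   # linear output layer
        E = P - Y                    # least-squares error
        dW2 = H.T @ E / len(X)       # standard output-layer update
        # Hidden-layer error routed through B, not W2.T as in exact backpropagation
        dH = (E @ B) * (1.0 - H**2)  # tanh'(z) = 1 - tanh(z)**2
        dW1 = X.T @ dH / len(X)
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2
```

    With no hidden layer the update reduces to dW = X.T @ (X @ W - Y) / len(X), i.e. gradient descent on the usual least-squares objective, matching the degenerate case mentioned in the abstract.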