4:00pm to 5:00pm - RH 306 - Applied and Computational Mathematics
Jun-Kun Wang (UCSD)
Optimization and Sampling via Hamiltonian Mechanics

The laws of classical mechanics, which describe the dynamics of moving bodies in physical space, have been a subject of great practical interest and have inspired centuries of deep mathematical study. In particular, the perspective that describes the dynamics of a particle as a Hamiltonian flow in phase space has found useful applications in designing sampling algorithms such as Hamiltonian Monte Carlo (HMC). In this talk, I will describe a time-varying integration-time scheme for HMC that gives provable acceleration over a fixed constant integration time when sampling from a class of distributions. I will then switch to optimization and introduce an optimization algorithm called Hamiltonian Descent (HD), which can be viewed as a direct counterpart of HMC in sampling. When solving strongly convex quadratic problems, HD exhibits a novel update scheme that involves matrix-power-vector products. I will also cover Coordinate Hamiltonian Descent and its parallelizable variant, which turn out to encapsulate the classical Gauss-Seidel method, Successive Over-relaxation, the Jacobi method, and more, for solving a linear system of equations. The talk is based on two papers, both joint work with Andre Wibisono (Yale).
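As a point of reference for the sampling part of the abstract, the sketch below is a standard HMC step with a leapfrog integrator, with the integration time exposed as a parameter (step_size times n_steps). It does not reproduce the time-varying schedule or the acceleration guarantees discussed in the talk; the target density, step size, and function names are illustrative assumptions.

```python
# Minimal HMC sketch (illustrative only): leapfrog integration of Hamiltonian
# dynamics followed by a Metropolis accept/reject step. The integration time
# T = n_steps * step_size is a free parameter here; the talk concerns choosing
# it in a time-varying way.
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_steps=20, rng=np.random):
    p = rng.standard_normal(x.shape)                 # resample momentum
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration of Hamiltonian dynamics.
    p_new += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_steps - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_prob(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis correction based on the change in total energy.
    h_old = -log_prob(x) + 0.5 * p @ p
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return x_new
    return x

# Example: sample from a standard Gaussian target.
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
x = np.zeros(2)
samples = []
for _ in range(1000):
    x = hmc_step(x, log_prob, grad_log_prob)
    samples.append(x)
```

For the optimization part, the abstract names the classical Gauss-Seidel and Jacobi iterations for a linear system Ax = b as special cases of Coordinate Hamiltonian Descent and its parallelizable variant. The textbook versions below are given only for reference and are not the talk's formulation; variable names are illustrative.

```python
# Classical iterative solvers for Ax = b: Jacobi updates all coordinates in
# parallel from the previous iterate, Gauss-Seidel sweeps through coordinates
# sequentially using the freshest values.
import numpy as np

def jacobi(A, b, x0, n_iters=100):
    D = np.diag(A)                        # diagonal entries of A
    R = A - np.diagflat(D)                # off-diagonal part
    x = x0.copy()
    for _ in range(n_iters):
        x = (b - R @ x) / D               # parallel coordinate update
    return x

def gauss_seidel(A, b, x0, n_iters=100):
    n = len(b)
    x = x0.copy()
    for _ in range(n_iters):
        for i in range(n):                # sequential coordinate update
            sigma = A[i, :] @ x - A[i, i] * x[i]
            x[i] = (b[i] - sigma) / A[i, i]
    return x
```
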
3:00pm to 4:00pm - 440R - Machine Learning
Nhat Thanh Van Tran (UCI)
Applying Transformer to Time Series: A Variate Focused Approach

Abstract: In this talk, I will present some new developments in applying transformers to time series. If time allows, I will also introduce state space models, in particular Mamba, and their applications to time series.
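For context on the state space models mentioned at the end of the abstract, here is a minimal sketch of the discrete linear recurrence that underlies SSM-style sequence models such as Mamba. The selective, input-dependent parameterization that characterizes Mamba itself is not reproduced, and all matrices and names here are illustrative assumptions.

```python
# Minimal discrete linear state space recurrence (the core building block of
# SSM-style sequence models): x_{t+1} = A x_t + B u_t,  y_t = C x_t.
# Mamba additionally makes the parameters input-dependent (a "selective" scan),
# which is not shown here.
import numpy as np

def ssm_scan(A, B, C, u):
    """Run the recurrence over an input sequence u of shape (T, input_dim)."""
    state_dim = A.shape[0]
    x = np.zeros(state_dim)
    ys = []
    for t in range(u.shape[0]):
        x = A @ x + B @ u[t]      # state update
        ys.append(C @ x)          # readout
    return np.stack(ys)

# Example: a 4-dimensional state filtering a scalar time series.
rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)
B = rng.standard_normal((4, 1))
C = rng.standard_normal((1, 4))
u = rng.standard_normal((100, 1))
y = ssm_scan(A, B, C, u)
```
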
2:00pm - 510R Rowland Hall - Combinatorics and Probability
Jorge Garza Vargas (Caltech)
A new proof of Friedman's second eigenvalue theorem with strong implications

In 2004, J. Friedman wrote a roughly 100-page paper proving a conjecture of Alon which stated that random d-regular graphs are nearly optimal expanders. Since then, Friedman's result has been refined and generalized in several directions, perhaps most notably by Bordenave and Collins, who in 2019 established strong convergence of independent permutation matrices (a massive generalization of Friedman's theorem), a result that led to groundbreaking results in spectral theory and geometry. In this talk I will present joint work with C. Chen, J. Tropp, and R. van Handel, in which we introduce a new proof technique that allows one to convert qualitative results in random matrix theory into quantitative ones. This technique yields a fundamentally new approach to the study of strong convergence that is more flexible and significantly simpler than existing techniques. Concretely, we are able to obtain (1) a remarkably short proof of Friedman's theorem, (2) a quantitative version of the result of Bordenave and Collins, and (3) a proof of strong convergence for arbitrary stable representations of the symmetric group, which constitutes a substantial generalization of the result of Bordenave and Collins.
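To make the statement concrete: Friedman's theorem says that, with high probability, the second-largest eigenvalue (in absolute value) of the adjacency matrix of a random d-regular graph is at most 2*sqrt(d-1) + o(1), the Alon-Boppana threshold. The snippet below is a minimal numerical illustration of that statement, assuming numpy and networkx are available; it has nothing to do with the proof techniques discussed in the talk.

```python
# Numerically compare the largest nontrivial eigenvalue of a random d-regular
# graph with the Alon-Boppana bound 2*sqrt(d-1) from Friedman's theorem.
import numpy as np
import networkx as nx

d, n = 4, 2000
G = nx.random_regular_graph(d, n, seed=0)
A = nx.to_numpy_array(G)                  # adjacency matrix
eigs = np.linalg.eigvalsh(A)              # sorted ascending; the largest is d
second = max(abs(eigs[0]), eigs[-2])      # largest nontrivial eigenvalue in absolute value
print(f"second eigenvalue: {second:.3f}, bound 2*sqrt(d-1): {2 * np.sqrt(d - 1):.3f}")
```
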
