2:00pm to 5:45pm - ISEB 1010 - Differential Geometry - (Special Seminar) Special Seminar in Geometric Analysis
Talk schedule (each talk 30-35 minutes):
2:00 PM - Alex Mramor (University of Copenhagen)
2:45 PM - Kai-Wei Zhao (Notre Dame)
3:30 PM - Hongyi Sheng (UC San Diego)
4:15 PM - Tin Yau Tsang (New York University)
5:00 PM - Xiaolong Li (Wichita State University)
3:00pm to 4:00pm - RH 440R - Machine Learning Eric Liu - (SDSU and UCI) Enhancing Model Efficiency: Applications of Tensor Train Decomposition in Machine Learning The application of Tensor Train (TT) decomposition in machine learning models provides a promising approach to the challenges of model size and computational complexity. By breaking high-dimensional weight tensors into smaller, more manageable tensor cores, TT decomposition allows for significant reductions in model size while maintaining performance. This presentation will explore how TT decomposition can be used effectively in different types of models, since it is adopted differently in recurrent models, Convolutional Neural Networks (CNNs), and Binary Neural Networks (BNNs). In recurrent models such as Long Short-Term Memory (LSTM) networks, large weight matrices are transformed into smaller tensor cores, reducing the number of parameters and the computational load. For CNNs, TT decomposition targets the convolutional layers, transforming convolutional filters into tensor cores that preserve spatial structure while significantly reducing parameters. In BNNs, TT decomposition is combined with weight binarization, yielding extremely compact models that retain the information needed for accurate predictions even with minimal computational power and memory. The primary aim of this presentation is to explore the theoretical foundations and practical applications of TT decomposition, demonstrating how the technique optimizes various machine learning models. The findings suggest that TT decomposition can greatly enhance model efficiency and scalability, making it a valuable tool for a wide range of applications.
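The abstract contains no code, but the core compression step is easy to illustrate. Below is a minimal NumPy sketch of the standard TT-SVD procedure: a weight matrix is viewed as a higher-order tensor and factored into a chain of small cores by sequential truncated SVDs. The 256x1024 layer, its reshaping into a 16x16x32x32 tensor, and max_rank=8 are invented for the example; the models and ranks in the talk may differ.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Factor a d-way array into tensor-train cores via sequential truncated SVDs."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))
        cores.append(U[:, :r].reshape(rank, shape[k], r))       # core k: (r_{k-1}, n_k, r_k)
        mat = (S[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))               # final core
    return cores

# Compress a dense 256x1024 weight matrix viewed as a 4-way tensor.
W = np.random.randn(256, 1024)
cores = tt_svd(W.reshape(16, 16, 32, 32), max_rank=8)
print(sum(c.size for c in cores), "parameters vs", W.size)      # 3456 vs 262144
```

The trade-off is controlled by max_rank: larger TT ranks retain more of the spectrum (better accuracy) at the cost of more parameters, which is the size/performance balance the abstract refers to.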
2:00pm to 3:00pm - 510R Rowland Hall - Combinatorics and Probability Dylan King - (Caltech) Online Ordered Ramsey Numbers The Ramsey number of graphs G1 and G2, the smallest N such that any red/blue coloring of the edges of the N-clique contains either a red G1 or a blue G2, is one of the most studied notions in combinatorics. We study a related process called the online ordered Ramsey game, played between two players, Builder and Painter. Builder has two graphs, G1 and G2, each with a specific ordering on its vertices. Builder starts with an edgeless graph on an ordered vertex set (the integers) and attempts to build either an ordered red copy of G1 or an ordered blue copy of G2 by adding one edge at a time. Each time Builder adds an edge, Painter must immediately color it red or blue. Ramsey's Theorem tells us that Builder can eventually win; Builder's objective is to do so in the minimum number of turns, and Painter's objective is to delay them as long as possible. The online ordered Ramsey number of G1 and G2 is the number of turns taken when both players play optimally. Online ordered Ramsey numbers were introduced by Pérez-Giménez, Prałat, and West in 2021. In this talk we will discuss their relation to other types of Ramsey numbers and present some results on the case when at least one of G1, G2 is sparse. (Joint work with Emily Heath, Grace McCourt, Hannah Sheats, and Justin Wisby.)
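For very small ordered graphs, the game value can be computed directly from the definition by minimax search. The toy sketch below (not from the talk) does this for G1 = G2 = the ordered path on three vertices; restricting Builder to a board of N = 5 ordered vertices only weakens Builder, so the computed value is an upper bound on the true online ordered Ramsey number.

```python
import itertools
import math
from functools import lru_cache

N = 5  # ordered board {0,...,4}; every full 2-coloring of it contains a mono ordered P3
EDGES = list(itertools.combinations(range(N), 2))

def mono_ordered_p3(d, color):
    """Is there a < b < c with edges (a, b) and (b, c) both of the given color?"""
    return any(d.get((a, b)) == color and d.get((b, c)) == color
               for a, b, c in itertools.combinations(range(N), 3))

@lru_cache(maxsize=None)
def value(colored):
    """Turns still needed under optimal play: Builder minimizes, Painter maximizes."""
    d = dict(colored)
    if mono_ordered_p3(d, "R") or mono_ordered_p3(d, "B"):
        return 0
    free = [e for e in EDGES if e not in d]
    if not free:
        return math.inf  # Painter survived the whole bounded board
    return min(max(1 + value(colored | frozenset([(e, c)])) for c in "RB")
               for e in free)

print(value(frozenset()))  # optimal number of turns for P3 vs P3 on this board
```

Brute force of this kind blows up quickly as the target graphs grow; the results in the talk concern structural bounds rather than exhaustive search.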
9:00am to 9:50am - Zoom - Inverse Problems Tony Liimatainen - (University of Helsinki) The general Calderón problem on Riemannian surfaces and inverse problems for minimal surfaces
1:00pm to 1:50pm - RH 510R - Algebra Ariel Rosenfield - (UCI) Enriched Grothendieck topologies under change of base In the presence of a monoidal right adjoint G : V -> U between locally finitely presentable symmetric monoidal categories, we examine the behavior of V-Grothendieck topologies on a V-category C, and that of their constituent covering sieves, under the change of enriching category induced by G. We prove in particular that when G is faithful and conservative, any V-Grothendieck topology on C corresponds uniquely to a U-Grothendieck topology on G_*C, and that when G is fully faithful, base change commutes with enriched sheafification in the sense of Borceux-Quinteiro.
4:00pm to 5:00pm - RH 340P - Applied and Computational Mathematics Linh Huynh - (Dartmouth) Inference on Large Random Networks: A Statistical Mechanics Perspective via a Modern Hopfield Network Model In this talk, I will discuss one of my research themes: inferring the microscopic mechanisms that give rise to macroscopic behaviors of systems. This theme spans a series of joint works with different collaborators, but I will focus on my most recent work with Ethan Levien. Motivated by applications such as pattern recognition, image classification, and network reconstruction, we consider a modern Hopfield network (a spin glass model) whose configuration space is R^N. We randomly select M = exp(alpha*N) configurations that are independent and identically distributed according to the standard Gaussian distribution, and we call these configurations "patterns". Given a vector x^0 that is sufficiently close to a typical pattern, say xi^1, we analyze the convergence in probability of x^0 to xi^1. We will also discuss the behavior of local minima of the corresponding energy landscape.
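The abstract does not specify the network's update rule; for orientation, here is a minimal NumPy sketch of pattern retrieval under the widely used softmax-based modern Hopfield update (in the style of Ramsauer et al.), which may differ from the exact dynamics analyzed in the talk. The values of N, alpha, beta, and the noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200                     # configurations live in R^N
alpha = 0.02                # storage exponent
M = int(np.exp(alpha * N))  # exponentially many patterns, M = exp(alpha*N)
beta = 1.0                  # inverse temperature (illustrative)

Xi = rng.standard_normal((N, M))  # columns: i.i.d. standard Gaussian patterns xi^1..xi^M

def step(x):
    """One modern-Hopfield update: softmax-weighted average of the stored patterns."""
    z = beta * (Xi.T @ x)             # alignment of x with each pattern
    w = np.exp(z - z.max())           # stabilized softmax weights
    w /= w.sum()
    return Xi @ w

x = Xi[:, 0] + 0.1 * rng.standard_normal(N)  # x^0: a noisy version of xi^1
for _ in range(25):
    x = step(x)

# relative distance to xi^1; small output means the dynamics retrieved the pattern
print(np.linalg.norm(x - Xi[:, 0]) / np.linalg.norm(Xi[:, 0]))
```

With these parameters the matching pattern dominates the softmax weights, so the iteration contracts to a point near xi^1; the talk's analysis concerns making this kind of retrieval statement precise, in probability, in the large-N regime.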
1:00pm - RH 114 - Graduate Seminar Zhiqin Lu - (UCI) Hearing the shape of a triangle |
