|
2:00pm to 3:00pm - 340P Rowland Hall - Combinatorics and Probability
Can M. Le - (UC Davis)
Variational Inference: Posterior Threshold Improves Network Clustering Accuracy in Sparse Regimes

Variational inference has been widely used in the machine learning literature to fit various Bayesian models. In network analysis, this method has been successfully applied to solve community detection problems. Although these results are promising, their theoretical support is limited to relatively dense networks, an assumption that may not hold for real networks. In addition, recent studies have shown that the variational loss surface has many saddle points, which can significantly impact its performance, especially when applied to sparse networks. This paper proposes a simple method to improve the variational inference approach by hard thresholding the posterior of the community assignment after each iteration. We show that the proposed method can accurately recover the true community labels even when the average node degree of the network is bounded and the initialization is arbitrarily close to random guessing. An extensive numerical study further confirms the advantage of the proposed method over classical variational inference and other algorithms.
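The thresholding step described in the abstract lends itself to a compact illustration. Below is a minimal sketch, not the authors' code: mean-field variational updates for a two-community stochastic block model, with the posterior hard-thresholded to {0, 1} after each iteration. The function name, the two-block restriction, and the assumption that the edge probabilities p and q are known are all simplifications for illustration.

```python
import numpy as np

def vi_threshold_sbm(A, p, q, z0, n_iter=20):
    """Mean-field VI for a 2-block SBM with posterior hard thresholding.

    A  : (n, n) symmetric 0/1 adjacency matrix
    p,q: within-/between-community edge probabilities (assumed known here)
    z0 : initial soft membership probabilities in [0, 1]
    """
    n = A.shape[0]
    a = np.log(p / q)                 # log-likelihood ratio for an edge
    b = np.log((1 - p) / (1 - q))     # log-likelihood ratio for a non-edge
    z = z0.astype(float)
    for _ in range(n_iter):
        m = 2 * z - 1                 # signed membership in [-1, 1]
        # Log-odds that each node belongs to community 1, aggregating
        # evidence from its edges and (off-diagonal) non-edges.
        s = a * (A @ m) + b * ((1 - A - np.eye(n)) @ m)
        z = 1.0 / (1.0 + np.exp(-s))  # soft posterior update
        z = (z >= 0.5).astype(float)  # hard-threshold the posterior
    return z
```

On a well-separated toy graph this converges almost immediately; the point of the thresholding, per the abstract, is robustness in sparse regimes where the untruncated posterior can stall near saddle points.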
|
4:00pm to 5:00pm - RH 306 - Applied and Computational Mathematics
Longxiu Huang - (Michigan State University)
Flexible Cross-Sampling for Efficient Matrix/Tensor Recovery

Sampling plays a central role in recovering low-rank structures from incomplete data. In this talk, I will present a unified framework of cross-concentrated sampling (CCS) and its tensor analog (t-CCS), which bridges the gap between unbiased sampling and CUR sampling. First, in the matrix setting, we consider a rank-r matrix and introduce a sampling scheme in which we select subsets of rows and columns and then sample within these "cross-concentrated" submatrices, thereby interpolating between uniform entry sampling and full CUR sampling. We establish sufficient conditions (under incoherence) guaranteeing exact recovery via a tailored non-convex iterative CUR-completion algorithm (ICURC). Second, broadening to third-order tensors, we extend the CCS concept to the tubal-rank tensor framework: selecting lateral and horizontal subtensors of a low-tubal-rank tensor, sampling within them, and developing a non-convex solver (ITCURTC) with provable guarantees. Notably, we derive explicit constants in the sufficient sampling condition, and demonstrate the empirical advantages of t-CCS over classical Bernoulli or full t-CUR sampling on real datasets. The talk will highlight the underlying mathematical principles, algorithmic developments, and numerical validation, illustrating how flexible sampling models provide a unified foundation for matrix and tensor recovery.
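The matrix-setting sampling scheme can be sketched in a few lines. This is a hypothetical illustration of how a cross-concentrated observation mask might be generated; the function name and interface are my own, and the completion solvers (ICURC, ITCURTC) are not reproduced here.

```python
import numpy as np

def ccs_mask(n_rows, n_cols, row_idx, col_idx, p, rng):
    """Cross-concentrated sampling mask: Bernoulli(p) observations restricted
    to the selected row submatrix and column submatrix (the "cross").

    p = 1 observes the whole cross (full CUR-style sampling); taking
    row_idx/col_idx to be all indices recovers uniform Bernoulli sampling
    of the entire matrix, so the scheme interpolates between the two.
    """
    cross = np.zeros((n_rows, n_cols), dtype=bool)
    cross[row_idx, :] = True          # selected rows
    cross[:, col_idx] = True          # selected columns
    mask = np.zeros_like(cross)
    mask[cross] = rng.random(int(cross.sum())) < p
    return mask
```

A recovery algorithm would then see only the entries of the data matrix where the mask is True.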
|
1:00pm to 2:00pm - RH 340N - Dynamical Systems
Victor Kleptsyn - (CNRS, University of Rennes 1, France)
Introduction to the Maslov index and its relation to discrete Schrödinger operators

This is an introductory talk around the general theme of Schrödinger operators and their transfer matrices. I will start by discussing the relation between the density of states for discrete Schrödinger operators on Z and the rotation number; we will then see what happens when one studies a band of finite width instead: symplectic operators and Maslov indices.
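As context for the transfer-matrix point of view, here is a minimal sketch (my own illustration of standard material, not the speaker's) of the one-step transfer matrices for a discrete Schrödinger operator on Z. Each matrix has determinant 1, so finite products lie in SL(2, R), the symplectic group in this dimension.

```python
import numpy as np

def transfer_matrix(E, v):
    """One-step transfer matrix for (H psi)_n = psi_{n+1} + psi_{n-1} + V_n psi_n
    at energy E: it maps (psi_n, psi_{n-1}) to (psi_{n+1}, psi_n)."""
    return np.array([[E - v, -1.0],
                     [1.0,    0.0]])

def cocycle(E, potential):
    """Product T_n ... T_1 of transfer matrices along a finite potential,
    propagating a solution of the eigenvalue equation across n sites."""
    M = np.eye(2)
    for v in potential:
        M = transfer_matrix(E, v) @ M
    return M
```

Rotation numbers and, in the finite-band-width setting, Maslov indices track how such products of symplectic matrices wind as the energy varies.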
|
2:00pm to 3:00pm - RH 306 - Applied and Computational Mathematics
Kevin Lin - (Department of Mathematics, University of Arizona)
Efficient computational modeling of cortical circuits via coarse-grained interactions

Biologically realistic models of cortical circuits are challenging to build, tune, and analyze due to the large numbers of neurons and their complex interactions. Reduced, or coarse-grained, models are more tractable, but there is a nontrivial trade-off between tractability and biological fidelity/interpretability. In this talk, I will describe a coarse-graining strategy inspired by ideas from nonequilibrium statistical mechanics; the aim is to balance biological realism and computational efficiency. I will illustrate how this strategy applies to the primate primary visual cortex. This is joint work with Zhou-Cheng Xiao and Lai-Sang Young.

Note: this is a joint CMCF seminar.
|
3:00pm to 4:00pm - 306 Rowland Hall - Differential Geometry
Jiyuan Han - (Westlake University)
On the existence of weighted-cscK metrics

Weighted-cscK metrics provide a universal framework for the study of canonical metrics, e.g., extremal metrics, Kähler-Ricci soliton metrics, and $\mu$-cscK metrics. In joint works with Yaxiong Liu, we prove that on a Kähler manifold X, the G-coercivity of the weighted Mabuchi functional implies the existence of weighted-cscK metrics. In particular, there exists a weighted-cscK metric if X is a projective manifold that is weighted K-stable for models. We will also discuss some progress on singular varieties.
|
4:00pm to 5:00pm - 306 Rowland Hall - Differential Geometry
Zhihan Wang - (Cornell University)
Geometry of Mean Curvature Flow near Cylindrical Singularities

A central question in geometric flows is to understand the formation of singularities. In this talk, I will focus on mean curvature flow, the negative gradient flow of the area functional, and explain how the local dynamics influence the shape of the flow and its singular set near a cylindrical singularity, as well as how the topology of the flow changes after passing through such a singularity with generic dynamics. This talk is based on joint works with Ao Sun and Jinxin Xue.
|
9:00am to 9:50am - Zoom - Inverse Problems
Josselin Garnier - (Ecole Polytechnique)
Sequential design for surrogate modeling in Bayesian inverse problems
|
1:00pm to 1:50pm - RH 340N - Algebra
Adam Yassine - (Pomona College)
Compositionality using Generalized Span Categories

Span categories provide an abstract framework for formalizing mathematical models of certain systems. The mathematical descriptions of some systems, such as classical mechanical systems, require categories that do not have pullbacks, and this limits the utility of span categories as a formal framework. If $\mathcal{C}$ and $\mathcal{C}'$ are categories and $F$ is a functor from $\mathcal{C}$ to $\mathcal{C}'$, we introduce the notion of an $F$-pullback of a cospan in $\mathcal{C}$ as well as the notion of span tightness of $F$. If $F$ is span tight, then we can form a generalized span category $\mathrm{Span}(\mathcal{C}, F)$, which allows us to work around the technical difficulty of $\mathcal{C}$ failing to have pullbacks.
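For context, composition in an ordinary span category is given by the pullback; the following standard construction (not specific to the $F$-pullbacks of the talk) is exactly what fails when $\mathcal{C}$ lacks pullbacks:

```latex
% Two composable spans, and their composite via the pullback S \times_Y T,
% where \pi_S, \pi_T denote the pullback projections of g along h:
X \xleftarrow{\;f\;} S \xrightarrow{\;g\;} Y,
\qquad
Y \xleftarrow{\;h\;} T \xrightarrow{\;k\;} Z
\quad\leadsto\quad
X \xleftarrow{\;f \circ \pi_S\;} S \times_Y T \xrightarrow{\;k \circ \pi_T\;} Z.
```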
