Week of June 4, 2023

Mon Jun 5, 2023
3:30pm - RH 340N - Geometry and Topology
Yu-Shen Lin - (Boston University)
On the moduli spaces of ALH*-gravitational instantons

Gravitational instantons are defined as non-compact hyperKähler
4-manifolds with L^2 curvature decay. They all arise as bubbling limits of K3
surfaces and thus serve as stepping stones toward understanding K3 metrics.
In this talk, we will focus on a special class of them called
ALH*-gravitational instantons. We will explain the Torelli theorem, and describe
their moduli spaces and some partial compactifications of these moduli spaces.
This talk is based on joint works with T. Collins, A. Jacob, R. Takahashi,
X. Zhu and S. Soundararajan.

Earlier time; joint seminar with the Differential Geometry Seminar.

3:30pm - RH 340N - Differential Geometry
Yu-Shen Lin - (Boston University)
On the moduli spaces of ALH*-gravitational instantons

Gravitational instantons are defined as non-compact hyperKähler
4-manifolds with L^2 curvature decay. They all arise as bubbling limits of K3
surfaces and thus serve as stepping stones toward understanding K3 metrics.
In this talk, we will focus on a special class of them called
ALH*-gravitational instantons. We will explain the Torelli theorem, and describe
their moduli spaces and some partial compactifications of these moduli spaces.
This talk is based on joint works with T. Collins, A. Jacob, R. Takahashi,
X. Zhu and S. Soundararajan.

Special date/time; joint with the Geometry and Topology Seminar.

4:00pm to 5:00pm - RH 306 - Applied and Computational Mathematics
Shuhao Cao - (University of Missouri–Kansas City)
Structure-conforming operator learning via Transformers

GPT, Stable Diffusion, AlphaFold 2: all of these state-of-the-art deep learning models are built on a neural architecture called the "Transformer". Since the emergence of "Attention Is All You Need", the Transformer has become the ubiquitous architecture in deep learning. At the heart of the Transformer is the "attention mechanism". In this talk, we give a specific example bearing on a fundamental question: whether and how one can exploit the theoretical structure of a mathematical problem to design task-oriented, structure-conforming deep neural networks. We propose an attention-based deep direct sampling method for solving Electrical Impedance Tomography, a class of boundary value inverse problems. We will also briefly survey progress within different communities on open problems concerning the mathematical properties of the attention mechanism in Transformers. This is joint work with Ruchi Guo (UC Irvine) and Long Chen (UC Irvine).
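For readers unfamiliar with the attention mechanism mentioned in the abstract, here is a minimal NumPy sketch of standard scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V; this is the textbook formulation, not the speaker's specific architecture, and the variable names and sizes are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return softmax(Q K^T / sqrt(d_k)) V and the attention weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Illustrative sizes: 3 queries/keys/values of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of the weight matrix is a probability distribution over the keys, so the output is a convex combination of the value vectors.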

Tue Jun 6, 2023
4:00pm - ISEB 1200 - Differential Geometry
Siu-Cheong Lau - (Boston University)
Quivers, stacks, and mirror symmetry

In this talk, we will start by introducing quiver representations
and some of their applications.  Then we will review Van den Bergh's
noncommutative crepant resolutions of singularities.  We will find that the
notion of quiver stacks is useful in unifying geometric and quiver
resolutions.  Finally, we will explain our motivation for, and construction
of, these quiver stacks from a symplectic mirror point of view.
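For readers unfamiliar with the terminology: a representation of a quiver assigns a vector space to each vertex and a linear map to each arrow. A minimal example (standard definition, not specific to the talk) for the quiver with two vertices and one arrow:

```latex
% Quiver Q with vertices 1, 2 and a single arrow a.
\[
  Q:\quad 1 \xrightarrow{\;a\;} 2
\]
% A representation of Q over a field k is a pair of vector spaces
% with a linear map along the arrow:
\[
  (V_1, V_2, \varphi_a), \qquad \varphi_a \colon V_1 \to V_2 .
\]
% For instance, V_1 = k^2, V_2 = k, \varphi_a = (1 \;\; 0) gives a
% representation of dimension vector (2, 1).
\]
```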

Thu Jun 8, 2023
9:00am to 9:50am - Zoom - Inverse Problems
Jianliang Qian - (Michigan State University)
A Fast Microlocal Hadamard-Babich Integrator for High-Frequency Helmholtz Equations in Inhomogeneous Media with Arbitrary Sources

https://sites.uci.edu/inverse/