# Finite-rank perturbations of random band matrices via infinitesimal free probability

Benson Au

UCSD

## Time:

Tuesday, October 8, 2019 - 11:00am to 12:00pm

## Location:

RH 340N

Free probability provides a unifying framework for studying random multi-matrix models in the large $N$ limit. Typically, the purview of these techniques is limited to invariant or mean-field ensembles. Nevertheless, we show that random band matrices fit quite naturally into this framework. Our considerations extend to the infinitesimal level, where finer results can be stated for the $\frac{1}{N}$ correction. As an application, we consider the question of outliers for finite-rank perturbations of our model. In particular, we find outliers at the classical positions from the deformed Wigner ensemble. No background in free probability is assumed.
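The outlier phenomenon mentioned above is easy to observe numerically. The sketch below (my illustration, not from the talk) adds a rank-one perturbation of strength $\theta > 1$ to a Wigner matrix and checks that the top eigenvalue separates from the bulk near the classical position $\theta + 1/\theta$ known from the deformed Wigner ensemble; the size $N = 1000$ and strength $\theta = 3$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, theta = 1000, 3.0  # illustrative parameters; theta > 1 triggers an outlier

# Wigner matrix normalized so the bulk spectrum fills [-2, 2]
G = rng.normal(size=(N, N))
W = (G + G.T) / np.sqrt(2 * N)

# Rank-one deformation in a fixed direction
v = np.zeros(N)
v[0] = 1.0
eigs = np.linalg.eigvalsh(W + theta * np.outer(v, v))

outlier = eigs[-1]
print(outlier)  # close to theta + 1/theta, well outside the bulk edge at 2
```

For $\theta \le 1$ the perturbation is swallowed by the bulk and no outlier appears, which is the transition the classical positions encode.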

# Finding perfect matchings in hypergraphs

Asaf Ferber

UCI

## Time:

Tuesday, September 24, 2019 - 11:00am to 12:00pm

## Location:

RH 340N

In this talk we will survey some open problems and recent results in probabilistic and extremal combinatorics related to finding perfect matchings in hypergraphs. The general plan (if time permits) is to motivate the general problem, discuss it for graphs, explain the difficulty when passing to hypergraphs, relate the basic problem to an unsolved probabilistic inequality which was conjectured by Feige, sketch some useful tools for tackling these problems, and report on some recent developments in the area.

# A combinatorial approach to the quantitative invertibility of random matrices

Vishesh Jain

MIT

## Time:

Tuesday, October 22, 2019 - 11:00am to 12:00pm

## Location:

RH 340N

Let $s_n(M)$ denote the smallest singular value of an $n \times n$ random (possibly complex) matrix $M$. We will discuss a novel combinatorial approach (in particular, one using neither inverse Littlewood--Offord theory nor net arguments) for obtaining upper bounds on the probability that $s_n(M)$ is smaller than a given number for quite general random matrix models. Such estimates are a fundamental part of the non-asymptotic theory of random matrices and have applications to the strong circular law, numerical linear algebra, etc. In several cases of interest, our approach provides stronger bounds than those obtained by Tao and Vu using inverse Littlewood--Offord theory.
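As a quick empirical companion (not part of the talk), the sketch below samples Gaussian matrices and checks that the smallest singular value is typically of order $n^{-1/2}$, the scaling behind the quantitative invertibility bounds; the dimension and trial count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 200, 50  # illustrative sizes

# Empirical smallest singular values of n x n Gaussian matrices;
# theory predicts s_n(M) is typically of order n^{-1/2}.
s_min = [np.linalg.svd(rng.normal(size=(n, n)), compute_uv=False)[-1]
         for _ in range(trials)]
scaled = np.sqrt(n) * np.array(s_min)
print(scaled.mean())  # an O(1) quantity after the sqrt(n) rescaling
```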

# Eigenvalue gaps of sparse random matrices

Kyle Luh

## Institution:

Harvard University

## Time:

Tuesday, October 15, 2019 - 11:00am to 12:00pm

## Location:

RH 340N

We will discuss some recent work on quantifying the gaps between eigenvalues of sparse random matrices. Before these results, it was not even known whether the eigenvalues were distinct for this class of random matrices. We will also touch upon how these results relate to random graphs, the graph isomorphism problem, and nodal domains. This is joint work with Van Vu and Patrick Lopatto.
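A toy experiment (my illustration, not the speaker's proof) makes the question concrete: sample a sparse symmetric sign matrix, compute its spectrum, and check that the minimal gap is strictly positive, i.e. that the spectrum is simple in this sample. The size $n = 300$ and sparsity $p = 0.1$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 300, 0.1  # illustrative size and sparsity

# Sparse symmetric sign matrix: each off-diagonal entry is nonzero
# with probability p, and nonzero entries are uniform in {-1, +1}
mask = rng.random((n, n)) < p
A = np.triu(mask * rng.choice([-1.0, 1.0], size=(n, n)), 1)
A = A + A.T

eigs = np.linalg.eigvalsh(A)  # sorted ascending
min_gap = np.min(np.diff(eigs))
print(min_gap)  # positive => all eigenvalues distinct in this sample
```

The hard part, of course, is proving quantitative lower bounds on such gaps that hold with high probability, which is what the talk addresses.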

# Analyzing Hybrid Randomized and Greedy Projection Methods

UCLA

## Time:

Tuesday, October 1, 2019 - 11:00am to 12:00pm

## Location:

RH 340N

Stochastic iterative algorithms have recently gained interest for solving large-scale systems of equations $Ax = y$. One such example is the Randomized Kaczmarz (RK) algorithm, which acts on only a single row of the matrix $A$ at a time. While RK selects a row at random, Motzkin's algorithm employs a greedy row selection; the Sampling Kaczmarz-Motzkin (SKM) algorithm combines these two strategies. In this talk, we present an improved convergence analysis for SKM which interpolates between RK and Motzkin's algorithm.
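A minimal sketch of the SKM iteration described above (my own illustration; note that the sampling here is uniform over rows, whereas RK proper samples rows proportionally to their squared norms). Sampling $\beta$ rows and projecting onto the one with the largest residual interpolates between random and greedy selection: $\beta = 1$ gives RK-style random selection, $\beta = m$ gives Motzkin's greedy rule.

```python
import numpy as np

def skm(A, y, beta, iters, rng):
    """Sampling Kaczmarz-Motzkin: sample `beta` rows uniformly, then
    project the iterate onto the sampled row with largest residual."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        rows = rng.choice(m, size=beta, replace=False)
        r = A[rows] @ x - y[rows]            # residuals on the sample
        i = rows[np.argmax(np.abs(r))]       # greediest sampled row
        # Kaczmarz step: orthogonal projection onto {x : A[i] @ x = y[i]}
        x -= (A[i] @ x - y[i]) / (A[i] @ A[i]) * A[i]
    return x

rng = np.random.default_rng(3)
A = rng.normal(size=(200, 20))   # consistent Gaussian test system
x_true = rng.normal(size=20)
y = A @ x_true
x = skm(A, y, beta=10, iters=2000, rng=rng)
err = np.linalg.norm(x - x_true)
print(err)  # small: the iterates converge to the solution
```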

# Invertibility of adjacency matrices of random graphs

Mark Rudelson

## Institution:

University of Michigan

## Time:

Tuesday, May 7, 2019 - 11:00am

## Location:

RH 510M

Consider the adjacency matrix of a bipartite, directed, or undirected Erdős–Rényi random graph. If the average degree of a vertex is large enough, then such a matrix is invertible with high probability. As the average degree decreases, the probability that the matrix is singular increases, and for a sufficiently small average degree, the matrix becomes singular with probability close to 1. We will discuss when this transition occurs and what the main reason for the singularity of the adjacency matrix is. This is joint work with Anirban Basak.
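The transition is visible even in small simulations. The sketch below (an illustration, not the speakers' argument) estimates the singularity probability of an undirected $G(n, p)$ adjacency matrix at low and high density; at low density, isolated vertices (zero rows) already force singularity. The size $n = 40$ and densities are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 40, 200  # illustrative size and sample count

def singular_fraction(p):
    """Fraction of sampled undirected G(n, p) adjacency matrices that
    are singular (rank-deficient)."""
    count = 0
    for _ in range(trials):
        U = np.triu(rng.random((n, n)) < p, 1).astype(float)
        A = U + U.T  # symmetric 0/1 adjacency matrix, zero diagonal
        if np.linalg.matrix_rank(A) < n:
            count += 1
    return count / trials

f_low = singular_fraction(0.05)   # average degree about 2
f_high = singular_fraction(0.5)   # dense regime
print(f_low, f_high)  # singularity is common at low density, rare at high
```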

# Hashing II. Applications.

Roman Vershynin

UCI

## Time:

Tuesday, February 26, 2019 - 11:00am to 11:50am

## Location:

306 RH

This talk will focus on applications of hashing. We use the Leftover Hash Lemma to count linearly independent polynomials defined on a given set. From this we will derive a recent result of Abbe, Shpilka, and Wigderson on the linear independence of random tensors. Unfortunately, methods based on hashing only work over finite fields. A totally different approach to random tensors was found by Pierre Baldi and myself, which I will explain in detail.

# Hashing

Roman Vershynin

UCI

## Time:

Tuesday, February 19, 2019 - 11:00am to 11:50am

## Location:

306 RH

Hashing is a technique widely used in coding theory (an area of computer science) and in cryptography. Although hashing is an interesting mathematical object, it is surprisingly little known to "mainstream" mathematicians. I will focus on one specific result on hashing, namely the Leftover Hash Lemma. We will state it as a result in extremal combinatorics, give a probabilistic proof of it, and relate it to another fundamental result in extremal combinatorics, the Sauer–Shelah Lemma.
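As background (my illustration, not from the talk), here is a standard 2-universal hash family, the kind of family to which the Leftover Hash Lemma applies: random linear maps $x \mapsto Ax$ over $\mathrm{GF}(2)$. For distinct inputs $x \neq x'$, a collision requires $A(x - x') = 0 \bmod 2$, which happens with probability exactly $2^{-m}$ over the choice of $A$; the sketch checks this empirically. The dimensions are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
k, m, trials = 8, 3, 2000  # illustrative input/output dimensions

# Two fixed distinct inputs in GF(2)^k
x = np.array([1, 0, 1, 1, 0, 0, 1, 0])
xp = np.array([0, 1, 1, 0, 0, 1, 0, 0])

collisions = 0
for _ in range(trials):
    A = rng.integers(0, 2, size=(m, k))   # random hash h_A(v) = A v mod 2
    if np.array_equal(A @ x % 2, A @ xp % 2):
        collisions += 1

rate = collisions / trials
print(rate)  # close to 2**-m = 0.125
```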

# The conjugate gradient algorithm on random matrices

Tom Trogdon

UC Irvine

## Time:

Tuesday, April 30, 2019 - 11:00am

## Location:

RH 510M

Numerical analysis and random matrix theory have long been coupled, going back (at least) to the seminal work of Goldstine and von Neumann (1951) on the condition number of random matrices. The works of Trotter (1984) and Silverstein (1985) incorporate "numerical techniques" to assist in the analysis of random matrices. One can also consider the problem of computing distributions (e.g., Tracy–Widom) from random matrix theory. In this talk, I will discuss a different numerical analysis problem: using random matrices to analyze the runtime (or halting time) of numerical algorithms. I will focus on recent results for the conjugate gradient algorithm, including a proof that the halting time is almost deterministic.
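The near-determinism of the halting time can be seen in a small experiment (my sketch, not the speaker's proof): run conjugate gradient on independent sample covariance matrices and record the iteration count needed to reach a fixed tolerance. Across samples, the counts are nearly identical. The sizes $n = 200$, $m = 400$ and the tolerance are arbitrary choices.

```python
import numpy as np

def cg_halting_time(A, b, tol=1e-8, max_iter=1000):
    """Run conjugate gradient on Ax = b and return the iteration at
    which the residual norm first drops below tol."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return k
        p = r + (rs_new / rs) * p
        rs = rs_new
    return max_iter

rng = np.random.default_rng(6)
n, m = 200, 400
times = []
for _ in range(5):
    X = rng.normal(size=(n, m))
    A = X @ X.T / m              # sample covariance; well-conditioned for m = 2n
    b = rng.normal(size=n)
    b /= np.linalg.norm(b)
    times.append(cg_halting_time(A, b))
print(times)  # iteration counts concentrate tightly across samples
```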