Variation of no-three-in-line problem

Speaker: 

Ji Zeng

Institution: 

UCSD

Time: 

Wednesday, March 6, 2024 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

The famous no-three-in-line problem, posed by Dudeney more than a century ago, asks whether one can select $2n$ points from the grid $[n]^2$ such that no three are collinear. We present two results related to this problem. First, we give a non-trivial upper bound for the maximum size of a set in $[n]^4$ such that no four are coplanar. Second, we characterize the behavior of the maximum size of a subset with no three collinear points in a random subset of $\mathbb{F}_q^2$, that is, the plane over the finite field of order $q$. We discuss their proofs and related open problems.
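As a toy illustration of the collinearity condition (not part of the talk), here is a brute-force check in Python that a point set contains no three collinear points, using the standard cross-product test:

```python
from itertools import combinations

def no_three_collinear(points):
    """Return True if no three of the given grid points are collinear."""
    for (x1, y1), (x2, y2), (x3, y3) in combinations(points, 3):
        # Three points are collinear iff the cross product of the
        # vectors (p2 - p1) and (p3 - p1) vanishes.
        if (x2 - x1) * (y3 - y1) == (x3 - x1) * (y2 - y1):
            return False
    return True

# A valid selection of 2n = 4 points from the grid [2]^2:
print(no_three_collinear([(1, 1), (1, 2), (2, 1), (2, 2)]))  # True
# A set containing a collinear triple on the main diagonal:
print(no_three_collinear([(1, 1), (2, 2), (3, 3), (1, 2)]))  # False
```

The brute-force check runs over all triples, so it is only practical for small $n$.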

A Theory of Neuronal Synaptic Balance

Speaker: 

Pierre Baldi

Institution: 

UCI

Time: 

Wednesday, February 7, 2024 - 2:00pm

Location: 

510R Rowland Hall

AI today can pass the Turing test and is in the process of transforming science, technology, society, humans, and beyond. Surprisingly, modern AI is built out of two very simple and old ideas, rebranded as deep learning: neural networks and gradient descent learning. When a typical feed-forward neural network is trained by gradient descent, with an L2 regularizer to avoid overly large synaptic weights, a strange phenomenon occurs: at the optimum, each neuron becomes "balanced" in the sense that the L2 norm of its incoming synaptic weights equals the L2 norm of its outgoing synaptic weights. We develop a theory that explains this phenomenon and exposes its generality. Balance emerges with a variety of activation functions, a variety of regularizers including all Lp regularizers, and a variety of networks including recurrent networks. A simple local balancing algorithm can be applied to any neuron at any time, not just at the optimum. Most remarkably, stochastic iterated application of the local balancing algorithm always converges to a unique, globally balanced state.
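The local balancing step can be sketched as follows (a toy illustration, not the speaker's code; it assumes a positively homogeneous activation such as ReLU, so that the rescaling preserves the network's input-output function):

```python
import numpy as np

def balance_neuron(w_in, w_out):
    """Rescale one neuron's weights so that ||w_in|| == ||w_out|| (L2).

    For positively homogeneous activations such as ReLU, multiplying
    the incoming weights by lam and the outgoing weights by 1/lam
    leaves the network's input-output function unchanged.  Choosing
    lam = sqrt(||w_out|| / ||w_in||) equalizes the two norms.
    """
    lam = np.sqrt(np.linalg.norm(w_out) / np.linalg.norm(w_in))
    return lam * w_in, w_out / lam

w_in = np.array([3.0, 4.0])        # ||w_in||  = 5
w_out = np.array([0.6, 0.8, 0.0])  # ||w_out|| = 1
b_in, b_out = balance_neuron(w_in, w_out)
print(np.linalg.norm(b_in), np.linalg.norm(b_out))  # both sqrt(5)
```

Applying this operation repeatedly to randomly chosen neurons is the "stochastic iterated application" referred to above.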

Homological Percolation on a Torus

Speaker: 

Paul Duncan

Institution: 

Hebrew University of Jerusalem

Time: 

Wednesday, January 31, 2024 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

Many well-studied properties of random graphs have interesting generalizations to higher dimensions in random simplicial complexes. We will discuss a version of percolation in which, instead of edges, we add two (or higher)-dimensional cubes to a subcomplex of a large torus at random. In this setting, we see a phase transition that marks the appearance of giant "sheets," analogous to the appearance of giant components in graph models. This phenomenon is most naturally described in the language of algebraic topology, but this talk will not assume any topological background. 

Based on joint work with Ben Schweinhart and Matt Kahle.

Erdős–Kac Central Limit Theorem

Speaker: 

Michael Cranston

Institution: 

UCI

Time: 

Wednesday, February 21, 2024 - 2:00pm

Location: 

510R Rowland Hall

The Erdős–Kac Central Limit Theorem says that if one selects an integer uniformly at random from 1 to N, then the number of distinct prime divisors of this integer satisfies a Central Limit Theorem. We (the speaker, in joint work with Tom Mountford) give a new proof of this result using the Riemann zeta distribution and a Tauberian Theorem. The proof generalizes easily to other settings, such as polynomials over a finite field or ideals in a number field.
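A quick numerical illustration (not the speaker's method): sampling random integers up to N and counting distinct prime factors, the sample mean should land near log log N, the centering in the Erdős–Kac theorem:

```python
import math
import random

def omega(n):
    """Number of distinct prime divisors of n, by trial division."""
    count, d = 0, 2
    while d * d <= n:
        if n % d == 0:
            count += 1
            while n % d == 0:
                n //= d
        d += 1
    if n > 1:
        count += 1
    return count

random.seed(0)
N = 10**6
samples = [omega(random.randint(1, N)) for _ in range(2000)]
mu = math.log(math.log(N))  # asymptotic mean (and variance) of omega
mean = sum(samples) / len(samples)
print(mean, mu)  # sample mean is close to log log N
```

The convergence is famously slow, since log log N grows so gently; for N = 10^6 the two values agree only roughly.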

Expansion of random 0/1 polytopes and the Mihail and Vazirani conjecture

Speaker: 

Luis Rademacher

Institution: 

UC Davis

Time: 

Wednesday, March 13, 2024 - 2:00pm

Location: 

510R Rowland Hall

A 0/1 polytope is the convex hull of a set of 0/1 d-dimensional vectors. A conjecture of Milena Mihail and Umesh Vazirani says that the graph of vertices and edges of every 0/1 polytope is highly connected. Specifically, it states that the edge expansion of the graph of every 0/1 polytope is at least one. Any lower bound on the edge expansion gives an upper bound on the mixing time of a random walk on the graph of the polytope. Such random walks are important because they can be used to generate an element from a set of combinatorial objects uniformly at random. A weaker form of the conjecture of Mihail and Vazirani says that the edge expansion of the graph of a 0/1 polytope in R^d is greater than 1 over some polynomial function of d. This weaker version of the conjecture would suffice for all applications. Our main result is that the edge expansion of the graph of a random 0/1 polytope in R^d is at least 1/(12d) with high probability. This is joint work with Brett Leroux.
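For intuition (illustrative only): the graph of the full cube [0,1]^d is the d-dimensional hypercube graph, whose edge expansion equals 1, matching the conjectured bound. A brute-force computation for d = 3:

```python
from itertools import combinations, product

def hypercube_graph(d):
    """Vertices and edges of the graph of the 0/1 cube in R^d."""
    verts = list(product([0, 1], repeat=d))
    edges = [(u, v) for u, v in combinations(verts, 2)
             if sum(a != b for a, b in zip(u, v)) == 1]
    return verts, edges

def edge_expansion(verts, edges):
    """min over nonempty S with |S| <= |V|/2 of |boundary(S)| / |S|."""
    best = float("inf")
    n = len(verts)
    for k in range(1, n // 2 + 1):
        for S in combinations(verts, k):
            S = set(S)
            boundary = sum((u in S) != (v in S) for u, v in edges)
            best = min(best, boundary / k)
    return best

verts, edges = hypercube_graph(3)
print(edge_expansion(verts, edges))  # 1.0, attained by a facet
```

The brute force is exponential in the number of vertices, so this is only feasible for very small d.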

Asymmetry helps: Non-backtracking spectral methods for sparse matrices and tensors

Speaker: 

Yizhe Zhu

Institution: 

UCI

Time: 

Wednesday, October 18, 2023 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

The non-backtracking operator is a non-Hermitian matrix associated with an undirected graph. It has become a powerful tool in the spectral analysis of random graphs. Recently, many new results for sparse Hermitian random matrices have been proved for the corresponding non-backtracking operator and translated into statements about the original matrices through the Ihara–Bass formula. In another direction, efficient algorithms based on the non-backtracking matrix have reached optimal sample complexity in many sparse low-rank estimation problems. I will talk about my recent work using the non-backtracking operator, including the Bai–Yin law for sparse random rectangular matrices, hypergraph community detection, and tensor completion.
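The non-backtracking operator itself is easy to write down; the following sketch (illustrative only, dense matrix, small graphs) builds it from an undirected edge list:

```python
import numpy as np

def non_backtracking_matrix(edges):
    """Non-backtracking operator B of an undirected graph.

    B is indexed by directed edges ("darts"): B[(u,v),(x,y)] = 1 iff
    v == x and y != u, i.e. the walk continues from v without
    immediately reversing the edge it just traversed.
    """
    darts = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    idx = {d: i for i, d in enumerate(darts)}
    B = np.zeros((len(darts), len(darts)))
    for (u, v) in darts:
        for (x, y) in darts:
            if v == x and y != u:
                B[idx[(u, v)], idx[(x, y)]] = 1.0
    return B, darts

# Triangle: every directed edge has exactly one non-backtracking successor.
B, _ = non_backtracking_matrix([(0, 1), (1, 2), (2, 0)])
print(B.shape, B.sum(axis=1))  # (6, 6); each row sums to 1
```

Note that B is genuinely non-symmetric, which is the "asymmetry" in the title.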

The Ramsey number r(4,t)

Speaker: 

Jacques Verstraete

Institution: 

UCSD

Time: 

Wednesday, October 11, 2023 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

In this talk, I will outline the proof that $r(4,t) = t^{3 - o(1)}$, which solves a long-standing open problem of Erdős. The proof combines a variety of techniques and lends itself to improving bounds on various other Ramsey numbers $r(F,t)$, where $F$ is a fixed graph, as well as to applications to a variety of extremal and Ramsey problems.

Set representation of sparse graphs and hypergraphs

Speaker: 

Marcelo Sales

Institution: 

UCI

Time: 

Wednesday, November 8, 2023 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

Let $G$ be a graph on $n$ vertices. Given an integer $t$, we say that a family of sets $\{A_x\}_{x \in V}\subset 2^{[t]}$ is a set representation of $G$ if $$xy \in E(G) \iff A_x \cap A_y = \emptyset.$$ Let $\overline{\theta}_1(G)$ be the smallest integer $t$ for which such a representation exists. In this talk I will discuss some of the bounds on $\overline{\theta}_1$ for sparse graphs and related problems. Joint work with Basu, Molnar, Rödl and Schacht.
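A small sanity check (illustrative only) of the defining condition: verify that a family of sets represents a given graph via disjointness:

```python
from itertools import combinations

def is_set_representation(edges, vertices, A):
    """Check that xy is an edge iff A[x] and A[y] are disjoint."""
    E = {frozenset(e) for e in edges}
    return all((frozenset((x, y)) in E) == A[x].isdisjoint(A[y])
               for x, y in combinations(vertices, 2))

# The path 0-1-2 represented over the ground set {1, 2}, so t = 2:
A = {0: {1}, 1: {2}, 2: {1}}
print(is_set_representation([(0, 1), (1, 2)], [0, 1, 2], A))  # True
```

Here edges correspond to disjoint set pairs and non-edges to intersecting pairs, exactly the condition in the display above.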

Probabilistic laws on infinite groups

Speaker: 

Gil Goffer

Institution: 

UCSD

Time: 

Wednesday, November 15, 2023 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

In various cases, a law that holds in a group with high probability must actually hold for all elements. For instance, a finite group in which the commutator law [x,y]=1 holds with probability at least 5/8 must be abelian. For infinite groups, one needs to work a bit harder to define the probability that a given law holds. One natural way is to sample a random element uniformly from the r-ball in the Cayley graph and let r tend to infinity; another is to sample elements using random walks. Amir, Blachar, Gerasimova, and Kozma asked whether a law that holds with probability 1 must actually hold globally, for all elements. In a recent joint work with Be’eri Greenfeld, we give a negative answer to their question.
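The 5/8 threshold can be checked numerically on a small nonabelian group; the following sketch (illustrative only) computes the commuting probability of the symmetric group S3:

```python
from itertools import permutations

def compose(p, q):
    """Composition of permutations given as tuples: (p.q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(len(q)))

# S3: all permutations of {0, 1, 2}.
G = list(permutations(range(3)))
commuting = sum(compose(p, q) == compose(q, p) for p in G for q in G)
prob = commuting / len(G) ** 2
print(prob)  # 0.5 -- below the 5/8 threshold, and indeed S3 is nonabelian
```

In general the commuting probability of a finite group equals (number of conjugacy classes)/|G|, which for S3 is 3/6.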

In this talk I will give an introduction to probabilistic group laws and present a finitely generated group that satisfies the law x^p=1 with probability 1, yet admits no group law that holds for all elements.

Deep Generative Models in Infinite-Dimensional Spaces

Speaker: 

Gavin Kerrigan

Institution: 

UCI

Time: 

Wednesday, November 29, 2023 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

Deep generative models have seen a meteoric rise in capabilities across a wide array of domains, ranging from natural language and vision to scientific applications such as precipitation forecasting and molecular generation. However, a number of important applications focus on data which is inherently infinite-dimensional, such as time-series, solutions to partial differential equations, and audio signals. This relatively under-explored class of problems poses unique theoretical and practical challenges for generative modeling. In this talk, we will explore recent developments for infinite-dimensional generative models, with a focus on diffusion-based methodologies. 
