Graphical Designs

Speaker: 

Stefan Steinerberger

Institution: 

University of Washington

Time: 

Wednesday, February 15, 2023 - 2:00pm to 3:00pm

Host: 

Location: 

510R Rowland Hall

Spherical designs are very special point sets on the sphere with the property that the average of any low-degree polynomial over the points equals the global average of the polynomial over the sphere. As it turns out, the definition can be suitably interpreted so that it also makes sense on a finite combinatorial graph. The arising structures are breathtakingly pretty (many pictures will be shown). They can be interpreted as the analogue of Platonic solids in graphs. Graphs can have many more symmetries than Euclidean space and, correspondingly, some of these point structures are remarkably symmetric. This is also naturally related to extremal combinatorics, where classical theorems (the Erdős-Ko-Rado theorem or the Deza-Frankl theorem) suddenly turn into beautiful special cases. If we only consider the hypercube graph {0,1}^d, we naturally encounter problems from coding theory. A probabilistic interpretation tells us new things about the speed with which random walks on the graph become random. There will be pictures, a survey of recent results by C. Babecki, K. Golubev, D. Shiroma, and R. Thomas, and many, many open problems.
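
For concreteness, the short script below checks the eigenvector-averaging definition numerically on the 3-dimensional hypercube graph; this is our own toy illustration (the antipodal pair is our choice of candidate subset), not an example taken from the talk.

```python
# A minimal numerical check of the eigenvector-averaging definition of a
# graphical design on the hypercube {0,1}^3 (illustrative toy example).
import itertools
import numpy as np

d = 3
verts = list(itertools.product([0, 1], repeat=d))
n = len(verts)
idx = {v: i for i, v in enumerate(verts)}

# Adjacency and Laplacian of the hypercube: vertices at Hamming distance 1 are adjacent.
A = np.zeros((n, n))
for v in verts:
    for k in range(d):
        w = list(v)
        w[k] ^= 1
        A[idx[v], idx[tuple(w)]] = 1
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)

# Candidate subset: an antipodal pair of vertices.
S = [idx[(0, 0, 0)], idx[(1, 1, 1)]]

# A subset is a design "up to frequency lambda" if every Laplacian eigenvector
# with eigenvalue <= lambda has the same average over the subset as over the
# whole graph (the global average is 0 for every non-constant eigenvector).
for lam, vec in zip(eigvals, eigvecs.T):
    print(f"eigenvalue {lam:.1f}: subset avg {vec[S].mean():+.3f}, global avg {vec.mean():+.3f}")
# In this tiny example the pair reproduces the global averages exactly for
# eigenvalues 0 and 2, but fails at eigenvalue 4.
```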

Community detection in hypergraphs

Speaker: 

Ioana Dumitriu

Institution: 

UCSD

Time: 

Wednesday, February 8, 2023 - 2:00pm to 3:00pm

Host: 

Location: 

510R Rowland Hall

The Hypergraph Stochastic Block Model (HSBM) is a complex, many-parameter model generalizing the well-known graph SBM, a paradigm for studying community detection. One of the main uses of the (H)SBM is that it allows us to develop guarantees for community detection algorithms by providing thresholds: parameter conditions that theoretically describe the necessary and sufficient requirements for community detection in the model. These thresholds naturally depend on the type of recovery desired.

I will talk about the most general conditions and thresholds we can develop, under various tweaks to the model, for almost exact and exact recovery in more, or less, general HSBMs. This is joint work with Haixiao Wang and Yizhe Zhu.
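
To make the model concrete, here is a toy sampler for a 3-uniform, two-community HSBM; the parameter names and values (n, p_in, p_out) are our own choices for illustration, not from the talk.

```python
# Toy 3-uniform hypergraph stochastic block model with two communities:
# a triple becomes a hyperedge with probability p_in if all three vertices
# lie in the same community and p_out otherwise.
import itertools
import random

def sample_hsbm(n=30, p_in=0.10, p_out=0.02, seed=0):
    rng = random.Random(seed)
    labels = [0] * (n // 2) + [1] * (n - n // 2)   # planted partition
    hyperedges = []
    for triple in itertools.combinations(range(n), 3):
        same_community = len({labels[v] for v in triple}) == 1
        if rng.random() < (p_in if same_community else p_out):
            hyperedges.append(triple)
    return labels, hyperedges

labels, edges = sample_hsbm()
print(len(edges), "hyperedges sampled")
```

Recovery guarantees are then statements about how well the planted labels can be reconstructed from the hyperedges alone, as a function of these parameters.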

Around the rational exponents conjecture

Speaker: 

David Conlon

Institution: 

Caltech

Time: 

Wednesday, November 2, 2022 - 2:00pm

Host: 

Location: 

510R Rowland Hall

Given a natural number n and a graph H, the extremal number ex(n, H) is the largest number of edges in an H-free graph on n vertices. One of the outstanding questions concerning extremal numbers, originally posed by Erdős and Simonovits, is the rational exponents conjecture, which asks whether for every rational number r between 1 and 2 there is a graph H such that c n^r < ex(n, H) < C n^r for some positive constants c and C. We will discuss some of the recent progress on this conjecture.
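
As a concrete instance, the brute-force script below computes ex(n, C4) for very small n; the four-cycle is a classical case where the exponent is known to be the rational number 3/2.

```python
# Brute-force computation of ex(n, C4) for tiny n (exhaustive over all graphs,
# so only feasible for very small n; purely illustrative).
import itertools

def ex_C4(n):
    pairs = list(itertools.combinations(range(n), 2))
    best = 0
    for edge_flags in itertools.product([0, 1], repeat=len(pairs)):
        adj = [set() for _ in range(n)]
        m = 0
        for (u, v), present in zip(pairs, edge_flags):
            if present:
                adj[u].add(v)
                adj[v].add(u)
                m += 1
        if m <= best:
            continue
        # A graph contains a C4 iff some pair of vertices has >= 2 common neighbors.
        if all(len(adj[u] & adj[v]) < 2 for u, v in pairs):
            best = m
    return best

for n in range(2, 7):
    print(n, ex_C4(n))   # 1, 3, 4, 6, 7 -- and ex(n, C4) grows like n^{3/2}
```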

Zero bias enhanced Stein couplings for normal approximation

Speaker: 

Larry Goldstein

Institution: 

USC

Time: 

Wednesday, October 19, 2022 - 2:00pm

Location: 

510R Rowland Hall

Stein's method for distributional approximation has become a valuable tool in probability and statistics by providing finite-sample distributional bounds for a wide class of target distributions in a number of metrics. A key step in popular versions of the method involves making coupling constructions, and a family of couplings due to Chen and Roellin vastly expanded the range of applications to which Stein's method for normal approximation could be applied. This family subsumes both Stein's classical exchangeable pair and the size bias coupling. A further simple generalization includes zero bias couplings and also allows for situations where the coupling is not exact. The zero bias versions result in bounds for which the often tedious computation of the variance of a conditional expectation is not required. An application to the Lightbulb process shows that even though the method may be simple to apply, it may yield improvements over previous results that had achieved bounds with optimal rates and small, explicit constants.
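
As background for readers who have not seen it (this is the standard definition, not the new coupling family from the talk), the zero-bias transform of Goldstein and Reinert is characterized by the following identity.

```latex
% Zero-bias transform: for a mean-zero random variable X with
% Var(X) = sigma^2 in (0, infinity), X^* has the X-zero-biased
% distribution when
\[
  \mathbb{E}\bigl[ X f(X) \bigr] \;=\; \sigma^{2}\, \mathbb{E}\bigl[ f'(X^{*}) \bigr]
  \qquad \text{for all absolutely continuous } f \text{ with } \mathbb{E}\lvert X f(X) \rvert < \infty .
\]
% Moreover X^* has the same distribution as X exactly when X is normal with
% mean 0 and variance sigma^2, which is why constructing a close coupling of
% X and X^* yields normal approximation bounds.
```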

 

Non-backtracking spectra of random hypergraphs and community detection

Speaker: 

Yizhe Zhu

Institution: 

UCI

Time: 

Wednesday, October 12, 2022 - 2:00pm

Host: 

Location: 

510R Rowland Hall

The stochastic block model has been one of the most fruitful research topics in community detection and clustering. Recently, community detection on hypergraphs has become an important topic in higher-order network analysis. We consider the detection problem in a sparse random tensor model called the hypergraph stochastic block model (HSBM). We prove that a spectral method based on the non-backtracking operator for hypergraphs works with high probability down to the generalized Kesten-Stigum detection threshold conjectured by Angelini et al. (2015). We characterize the spectrum of the non-backtracking operator for sparse random hypergraphs and provide an efficient dimension reduction procedure using the Ihara-Bass formula for hypergraphs. As a result, the community detection problem can be reduced to an eigenvector problem of a non-normal matrix constructed from the adjacency matrix and the degree matrix of the hypergraph. Based on joint work with Ludovic Stephan (EPFL). https://arxiv.org/abs/2203.07346
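
As background for the graph case (the talk treats the hypergraph analogue), the sketch below builds the non-backtracking matrix of a small random graph and numerically checks the classical Ihara-Bass identity det(I - uB) = (1 - u^2)^{m-n} det(I - uA + u^2(D - I)); this is the mechanism that lets the 2m x 2m operator be traded for a matrix whose size is proportional to the number of vertices. The graph and parameter choices are ours.

```python
# Numerical check of the Ihara-Bass identity for an ordinary graph
# (warm-up for the hypergraph version discussed in the talk).
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 0.5
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # adjacency matrix of a G(n, p) sample
D = np.diag(A.sum(axis=1))                   # degree matrix
m = int(A.sum()) // 2                        # number of undirected edges

# Directed edges ("darts"); B[(u,v), (v,w)] = 1 whenever w != u.
darts = [(u, v) for u in range(n) for v in range(n) if A[u, v]]
B = np.zeros((2 * m, 2 * m))
for i, (u, v) in enumerate(darts):
    for j, (x, w) in enumerate(darts):
        if x == v and w != u:
            B[i, j] = 1.0

# Ihara-Bass: det(I - uB) = (1 - u^2)^(m - n) * det(I - uA + u^2 (D - I)).
u_val = 0.3
I = np.eye(n)
lhs = np.linalg.det(np.eye(2 * m) - u_val * B)
rhs = (1 - u_val ** 2) ** (m - n) * np.linalg.det(I - u_val * A + u_val ** 2 * (D - I))
print(lhs, rhs)   # the two sides agree up to floating-point error
```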

Colorings of the Middle Layers of the Hamming Cube

Speaker: 

Gwen McKinley

Institution: 

UCSD

Time: 

Wednesday, December 7, 2022 - 2:00pm to 3:00pm

Location: 

510R Rowland Hall

There is a rich and growing body of literature dedicated to studying typical colorings, independent sets, and more general homomorphisms in regular bipartite graphs. Much of this literature has been devoted to the Hamming cube and the discrete torus, where very strong structural and enumerative results are known. However, a number of the techniques that have been used rely heavily on the specific structure of these graphs. Here, we consider the middle two layers of the Hamming cube, which have slightly less "nice structure" than the entire Hamming cube, and ask for the typical structure of a q-coloring (where q is any constant). When q is even, we prove structural and enumerative results analogous to those known for the Hamming cube. In this talk, I will discuss some of our techniques, and future directions to generalize this work to other graphs. This project is joint with Lina Li and Jinyoung Park.
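
To make the counting question concrete, the snippet below brute-forces the number of proper q-colorings of the middle two layers of {0,1}^d for d = 3, where the graph is simply a 6-cycle; the results in the talk concern the asymptotics as d grows, so this is only an illustration of the object being counted.

```python
# Brute-force count of proper q-colorings of the middle two layers of {0,1}^d.
import itertools

def middle_layers_graph(d):
    k = d // 2
    verts = [v for v in itertools.product([0, 1], repeat=d) if sum(v) in (k, k + 1)]
    idx = {v: i for i, v in enumerate(verts)}
    edges = []
    for v in verts:
        if sum(v) == k:                        # connect each lower-layer vertex
            for j in range(d):                 # to its upper-layer neighbors
                if v[j] == 0:
                    w = v[:j] + (1,) + v[j + 1:]
                    edges.append((idx[v], idx[w]))
    return len(verts), edges

def count_colorings(n_verts, edges, q):
    return sum(
        all(c[u] != c[v] for u, v in edges)
        for c in itertools.product(range(q), repeat=n_verts)
    )

n_verts, edges = middle_layers_graph(3)
print(count_colorings(n_verts, edges, 3))      # 66: proper 3-colorings of the 6-cycle
```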

Overcoming the curse of dimensionality for solving high-dimensional Hamilton-Jacobi partial differential equations or optimal control problems using neural networks

Speaker: 

Tingwei Meng

Institution: 

UCLA

Time: 

Wednesday, October 5, 2022 - 2:00pm

Host: 

Location: 

510R Rowland Hall

Hamilton-Jacobi PDEs and optimal control problems are widely used in many practical problems in control engineering, physics, financial mathematics, and machine learning. For instance, controlling an autonomous system is important in everyday modern life, and it requires a scalable, robust, efficient, and data-driven algorithm for solving optimal control problems. Traditional grid-based numerical methods cannot solve these high-dimensional problems, because they are not scalable and may suffer from the curse of dimensionality. To overcome the curse of dimensionality, we developed several neural network methods for solving high-dimensional Hamilton-Jacobi PDEs and optimal control problems. This talk has two parts. In the first part, I will talk about the SympOCNet method for solving multi-agent path planning problems, which solves a 512-dimensional path planning problem with a training time of less than 1.5 hours. In the second part, I will show several neural network architectures with solid theoretical guarantees for solving certain classes of high-dimensional Hamilton-Jacobi PDEs. By leveraging dedicated efficient hardware designed for neural networks, these methods have the potential for real-time applications in the future. These are joint works with Jerome Darbon, George Em Karniadakis, Peter M. Dower, Gabriel P. Langlois, and Zhen Zhang.
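
As classical background (and emphatically not the speaker's method), the sketch below evaluates the Hopf-Lax formula S(x, t) = min_y { J(y) + |x - y|^2 / (2t) } for the Hamiltonian H(p) = |p|^2 / 2 at a single point in dimension 10, using a local optimizer rather than a spatial grid; the initial data J is our own toy choice. Grid-free, pointwise representations of this kind are part of what makes neural-network approaches to high-dimensional Hamilton-Jacobi equations plausible.

```python
# Pointwise evaluation of the Hopf-Lax formula in dimension 10 (toy example).
import numpy as np
from scipy.optimize import minimize

dim = 10

def J(y):
    # Initial data (our choice): the minimum of two quadratic bowls.
    return min(np.sum((y - 1.0) ** 2), np.sum((y + 1.0) ** 2))

def hopf_lax(x, t):
    # S(x, t) = min_y { J(y) + |x - y|^2 / (2t) }, evaluated by running a
    # local optimizer from a few starting points -- no grid in R^dim needed.
    starts = [np.ones(dim), -np.ones(dim), x]
    values = []
    for y0 in starts:
        res = minimize(lambda y: J(y) + np.sum((x - y) ** 2) / (2.0 * t), y0)
        values.append(res.fun)
    return min(values)

x = np.zeros(dim)
print(hopf_lax(x, t=0.5))   # about dim / 2 = 5.0 for this symmetric example
```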

Strong freezing of the binary perceptron model

Speaker: 

Shuangping Li

Institution: 

Stanford University

Time: 

Wednesday, October 26, 2022 - 2:00pm to 3:00pm

Host: 

Location: 

510R Rowland Hall

We consider the binary perceptron model, a simple model of neural networks that has gathered significant attention in the statistical physics, information theory, and probability theory communities. We show that at low constraint density (m = n^{1-epsilon}), the model exhibits a strong freezing phenomenon with high probability; that is, most solutions are isolated. We prove this by a refined analysis of the log partition function. Our proof technique relies on a second moment method and cluster expansions. This is based on joint work with Allan Sly.
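
To make the model and the notion of an isolated solution concrete, here is a toy brute-force version; we use the half-space convention <g_i, x> >= kappa * sqrt(n) with Gaussian g_i and kappa = 0, which is one common formulation, and of course the asymptotic freezing phenomenon is not visible at this tiny scale.

```python
# Toy binary perceptron: enumerate all x in {-1, +1}^n satisfying m random
# half-space constraints, and check which solutions are isolated (no single
# sign flip yields another solution).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, m, kappa = 14, 4, 0.0
G = rng.standard_normal((m, n))              # constraint vectors g_1, ..., g_m

def is_solution(x):
    return bool(np.all(G @ x >= kappa * np.sqrt(n)))

solutions = []
for bits in itertools.product([-1, 1], repeat=n):
    x = np.array(bits)
    if is_solution(x):
        solutions.append(x)

def isolated(x):
    for j in range(n):
        y = x.copy()
        y[j] = -y[j]
        if is_solution(y):
            return False
    return True

num_isolated = sum(isolated(x) for x in solutions)
print(len(solutions), "solutions,", num_isolated, "of them isolated")
```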

Detecting Hidden Communities by Power Iterations with Connections to Vanilla Spectral Algorithms

Speaker: 

Jiapeng Zhang

Institution: 

USC

Time: 

Wednesday, November 30, 2022 - 2:00pm to 3:00pm

Host: 

Location: 

510R Rowland Hall

Community detection in the stochastic block model is one of the central problems of graph clustering. In this setup, spectral algorithms have been one of the most widely used frameworks for the design of clustering algorithms. However, despite the long history of study, there are still unsolved challenges. One of the main open problems is the design and analysis of "simple" spectral algorithms, especially when the number of communities is large.

In this talk, I will discuss two algorithms. The first is based on the power-iteration method. In the dense graph regime, its guarantees match, up to logarithmic factors, the best known bounds, due to Van Vu (Combinatorics, Probability and Computing, 2018).

Then, based on a connection between the powered adjacency matrix and its eigenvectors, we provide a "vanilla" spectral algorithm for a large number of communities in the balanced case. Our spectral algorithm is as simple as PCA (principal component analysis).

This talk is based on joint work with Chandra Sekhar Mukherjee (https://arxiv.org/abs/2211.03939).
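
For intuition only (this is a deliberate simplification with two balanced communities, not the algorithm analyzed in the talk), the sketch below runs power iteration on a centered adjacency matrix of an SBM sample and reads the communities off the signs of the resulting vector.

```python
# Power iteration on a centered SBM adjacency matrix (two balanced communities).
import numpy as np

rng = np.random.default_rng(0)
n, p_in, p_out = 400, 0.10, 0.02
sigma = np.repeat([1, -1], n // 2)            # planted labels

# Sample the adjacency matrix: edge probability p_in within a community, p_out across.
P = np.where(np.equal.outer(sigma, sigma), p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Subtract the mean to suppress the all-ones direction, then power iterate;
# the dominant remaining direction approximates the community indicator.
M = A - A.mean() * np.ones((n, n))
v = rng.standard_normal(n)
for _ in range(50):
    v = M @ v
    v /= np.linalg.norm(v)

agreement = max(np.mean(np.sign(v) == sigma), np.mean(np.sign(v) == -sigma))
print("agreement with planted labels:", agreement)
```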

Estimation of the covariance matrix in the presence of outliers

Speaker: 

Nikita Zhivotovskiy

Institution: 

UC Berkeley

Time: 

Wednesday, November 16, 2022 - 2:00pm to 3:00pm

Host: 

Location: 

510R Rowland Hall

Suppose we observe a sample of independent random vectors, knowing that the original distribution has been contaminated, so that a fraction of the observations came from a different distribution. How can we estimate the covariance matrix of the original distribution in this case? In this talk, we discuss an estimator of the covariance matrix that achieves the optimal dimension-free rate of convergence under two standard notions of data contamination: we allow the adversary to corrupt a fraction of the sample arbitrarily, while the distribution of the remaining data points only satisfies a certain (rather weak) moment equivalence assumption. Despite requiring the existence of only a few moments, our estimator achieves the same tail estimates as if the underlying distribution were Gaussian. Based on joint work with Pedro Abdalla.
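
To illustrate the contamination setup itself (this is not the estimator from the talk), the snippet below shows how a small fraction of arbitrary outliers can move the plain sample covariance far from the truth in operator norm, which is what motivates robust estimators; all parameters are our own toy choices.

```python
# Effect of an epsilon-fraction of outliers on the sample covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 2000, 20, 0.05
Sigma = np.diag(np.linspace(1.0, 3.0, d))     # true covariance of the clean data

X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
n_out = int(eps * n)
X[:n_out] = 50.0 * rng.standard_normal((n_out, d))   # adversarially large outliers

def operator_norm_error(S):
    return np.linalg.norm(S - Sigma, 2)

print("sample covariance error  :", operator_norm_error(np.cov(X, rowvar=False)))
print("oracle (clean-only) error:", operator_norm_error(np.cov(X[n_out:], rowvar=False)))
```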
