Can we spot a fake?

Speaker: 

Roman Vershynin

Institution: 

UC Irvine

Time: 

Wednesday, November 20, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

The ongoing AI revolution is transforming human-computer interaction. But not all interaction is good: generative AI has made it easy to create fake data, including images, news, and soon videos. But what is a fake? Can we define it mathematically? Which fakes can be detected, and which cannot? I will suggest a simple probabilistic framework in which these questions lead to concrete open (and fun!) problems in mathematics. This talk is based on joint work with Shahar Mendelson and Grigoris Paouris: https://arxiv.org/abs/2410.18880

Sums of dilates

Speaker: 

Jeck Lim

Institution: 

Caltech

Time: 

Wednesday, October 30, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

For any subset $A$ of a commutative ring $R$ (or, more generally, an $R$-module $M$) and any elements $\lambda_1, \dots, \lambda_k$ of $R$, let

\[\lambda_1 \cdot A + \cdots + \lambda_k \cdot A = \{\lambda_1 a_1 + \cdots + \lambda_k a_k : a_1, \dots, a_k \in A\}.\]

Such sums of dilates have attracted considerable attention in recent years, with the basic problem asking for an estimate on the minimum of $|\lambda_1 \cdot A + \cdots + \lambda_k \cdot A|$ in terms of $|A|$. In this talk, I will discuss various generalizations and settings of this problem and share recent progress. This is based on joint work with David Conlon.
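As a concrete (added) illustration of the definition above, the short Python sketch below computes a sum of dilates for a small set of integers and compares its size with $|A|$; the set $A$ and the coefficients are arbitrary choices for demonstration, not examples from the talk.

from itertools import product

def sum_of_dilates(A, coeffs):
    # {c_1*a_1 + ... + c_k*a_k : a_1, ..., a_k in A} for the given coefficients
    return {sum(c * a for c, a in zip(coeffs, choice))
            for choice in product(A, repeat=len(coeffs))}

A = {0, 1, 3, 7}
S = sum_of_dilates(A, (1, 2))   # the sum of dilates A + 2.A
print(len(A), len(S))           # compare |A| with |A + 2.A|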

Independence number of hypergraphs under degree conditions

Speaker: 

Marcelo Sales

Institution: 

UCI

Time: 

Wednesday, October 23, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

A well-known result of Ajtai et al. from 1982 states that every $k$-graph $H$ on $n$ vertices with girth at least five and average degree $t^{k-1}$ contains an independent set of size $c n \frac{(\log t)^{1/(k-1)}}{t}$ for some constant $c>0$. In this talk, we explore a related problem in which the girth condition is relaxed to allow certain cycles of length 2, 3, and 4. We will also present lower bounds on the size of independent sets in hypergraphs under specific degree conditions. This is joint work with Vojtěch Rödl and Yi Zhao.
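As a quick sanity check on this bound (an added remark, not part of the abstract), note that in the graph case $k = 2$ the average degree is $t$ and the guaranteed independent set has size

\[c\, n\, \frac{\log t}{t},\]

which is the classical independence number bound for graphs of average degree $t$ and girth at least five.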

Corners in Quasirandom Groups via Sparse Mixing

Speaker: 

Anthony Ostuni

Institution: 

UCSD

Time: 

Wednesday, December 4, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

We improve the best known upper bounds on the density of corner-free sets over quasirandom groups from inverse poly-logarithmic to quasi-polynomial. We make similarly substantial improvements to the best known lower bounds on the communication complexity of a large class of permutation functions in the 3-player Number-on-Forehead model. Underpinning both results is a general combinatorial theorem that extends the recent work of Kelley, Lovett, and Meka (STOC'24), itself a development of ideas from the breakthrough result of Kelley and Meka on three-term arithmetic progressions (FOCS'23).

Randomly sparsified Richardson iteration is really fast

Speaker: 

Robert Webber

Institution: 

UCSD

Time: 

Wednesday, November 6, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

Recently, a class of algorithms combining classical fixed point iterations with repeated random sparsification of approximate solution vectors has been successfully applied to eigenproblems with matrices as large as $10^{108} \times 10^{108}$. So far, a complete mathematical explanation for their success has proven elusive. The family of random sparsification methods has not yet been extended to the important case of linear system solves. This talk proposes a new algorithm based on repeated random sparsification that is capable of solving linear systems in extremely high dimensions and provides a complete mathematical analysis of the new algorithm. The analysis establishes a faster-than-Monte Carlo convergence rate and justifies use of the scheme even when the solution vector is too large to store.
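As a rough illustration of the general idea (a minimal sketch under simplifying assumptions, not the algorithm analyzed in the talk), the Python below combines the classical Richardson iteration $x \leftarrow x + \omega(b - Ax)$ with an unbiased random sparsification of the iterate after every step; the sparsification rule, step size, and test problem are illustrative choices.

import numpy as np

def sparsify(x, m, rng):
    # Unbiased random sparsification: keep entry i with probability
    # p_i = min(1, m*|x_i| / ||x||_1) and rescale by 1/p_i, so E[output] = x.
    norm1 = np.abs(x).sum()
    if norm1 == 0.0:
        return x
    p = np.minimum(1.0, m * np.abs(x) / norm1)
    keep = rng.random(x.shape) < p
    out = np.zeros_like(x)
    out[keep] = x[keep] / p[keep]
    return out

def sparsified_richardson(A, b, omega, m, iters, rng):
    # Richardson iteration with the iterate randomly sparsified after each
    # update (an illustrative variant, not the scheme from the talk).
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x + omega * (b - A @ x)
        x = sparsify(x, m, rng)
    return x

# Tiny demo on a well-conditioned system, far from the extreme
# dimensions mentioned in the abstract.
rng = np.random.default_rng(0)
n = 200
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x = sparsified_richardson(A, b, omega=0.5, m=100, iters=1000, rng=rng)
print(np.linalg.norm(A @ x - b))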

Fairness and Foundations in Machine Learning

Speaker: 

Deanna Needell

Institution: 

UCLA

Time: 

Wednesday, November 13, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

In this talk, we will address recent work centered on the themes of fairness and foundations in machine learning and highlight the challenges in this area. We will discuss recent results involving linear algebraic tools for learning, such as methods in non-negative matrix factorization that include tailored approaches for fairness. We will showcase the theoretical guarantees we derive as well as practical applications of those approaches. Then, we will discuss new foundational results that theoretically justify phenomena like benign overfitting in neural networks. Throughout the talk, we will include example applications from collaborations with community partners, using machine learning to help organizations with fairness and justice goals.
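The abstract mentions non-negative matrix factorization; purely as background (a sketch of the standard multiplicative-update algorithm of Lee and Seung, not the fairness-tailored methods discussed in the talk), a minimal NMF routine looks like this:

import numpy as np

def nmf(X, rank, iters=200, eps=1e-10, seed=0):
    # Plain NMF via multiplicative updates for the Frobenius loss,
    # factoring a nonnegative X (n x d) as W @ H with W, H nonnegative.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.random((n, rank))
    H = rng.random((rank, d))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage on synthetic nonnegative data.
X = np.abs(np.random.default_rng(1).standard_normal((100, 40)))
W, H = nmf(X, rank=5)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))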

Introduction to Robust Statistics III

Speaker: 

Pedro Abdalla

Institution: 

UCI

Time: 

Wednesday, October 16, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

Robust statistics is a classical topic that dates back to the seminal work of Huber in the 1960s. In essence, the main goal of the field is to account for the effect of outliers when performing estimation tasks, such as mean estimation. A recent line of research, inspired by the seminal work of Catoni, has revisited some classical problems in robust statistics from a non-asymptotic perspective. The goal of this short seminar series is to introduce the key ideas related to robust estimation and to discuss various notions of robustness, including heavy-tailed distributions and adversarial contamination. The primary example will be the mean estimation problem and, if time permits, I will also cover covariance estimation.
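As one concrete example of the non-asymptotic estimators studied in this line of work (a standard construction added for illustration, not necessarily the one covered in the lectures), the median-of-means estimator achieves sub-Gaussian-type deviation bounds for heavy-tailed data:

import numpy as np

def median_of_means(x, k, seed=0):
    # Median-of-means: shuffle the sample, split it into k blocks,
    # average each block, and return the median of the block means.
    x = np.asarray(x, dtype=float)
    perm = np.random.default_rng(seed).permutation(len(x))
    blocks = np.array_split(x[perm], k)
    return np.median([b.mean() for b in blocks])

# Toy comparison on heavy-tailed data: np.random's pareto(a) has mean
# 1/(a-1), so the true mean below is 0.5.
rng = np.random.default_rng(42)
sample = rng.pareto(3.0, size=10_000)
print(sample.mean(), median_of_means(sample, k=50))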
