Speaker: Lara Kassab

Institution: UCLA

Time: Monday, March 18, 2024 - 4:00pm to 5:00pm

Host:

Abstract:
In this talk, we consider large-scale ranking problems in which we are given a set of pairwise comparisons (e.g., item A is superior to item B) and seek the underlying ranking consistent with those comparisons. We show that stochastic gradient descent approaches, particularly the Kaczmarz method, can be leveraged to converge to a solution that reveals the underlying ranking while requiring only low-memory operations. We introduce several variations of this approach that trade off speed against convergence quality when the pairwise comparisons are noisy. We prove almost sure convergence results and study several regimes, including those with full, partial, and noisy observations. Our empirical results give insight into how many observations are required and how much noise in those measurements can be tolerated.
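As a rough illustration of the kind of method discussed in the abstract (not the speaker's actual algorithm), one can encode each comparison "item i beat item j" as a linear constraint r[i] - r[j] = margin and run randomized Kaczmarz on the resulting system: each step projects the current score vector onto one randomly chosen constraint, touching only two entries per iteration. The margin value, the tail-averaging of iterates (a standard variance-reduction device when the comparisons are inconsistent or noisy), and all names below are illustrative assumptions.

```python
import numpy as np

def kaczmarz_rank(n_items, comparisons, margin=1.0, n_iters=20000,
                  burn_in=10000, seed=0):
    """Sketch of randomized Kaczmarz for ranking from pairwise comparisons.

    Each pair (i, j) in `comparisons` means item i beat item j, encoded
    as the constraint r[i] - r[j] = margin. One step projects r onto a
    randomly chosen constraint; since the constraint row has entries
    +1 and -1 (squared norm 2), only two scores change per step.
    Iterates after `burn_in` are averaged to damp oscillations when the
    constraints are inconsistent or noisy (an assumed variation).
    """
    rng = np.random.default_rng(seed)
    r = np.zeros(n_items)
    avg = np.zeros(n_items)
    for t in range(n_iters):
        i, j = comparisons[rng.integers(len(comparisons))]
        # Kaczmarz projection: residual / squared row norm (= 2)
        delta = (margin - (r[i] - r[j])) / 2.0
        r[i] += delta
        r[j] -= delta
        if t >= burn_in:
            avg += r
    return avg / (n_iters - burn_in)

# Hypothetical data: comparisons consistent with the order 0 > 1 > 2 > 3
comps = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3), (0, 3)]
scores = kaczmarz_rank(4, comps)
ranking = np.argsort(-scores)  # recovered ranking, best item first
```

Note that this toy system is inconsistent (a uniform margin cannot hold simultaneously for adjacent and non-adjacent pairs), which is exactly the situation where plain Kaczmarz iterates hover rather than converge; the averaged iterate still recovers the correct ordering.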