Uniform laws of large numbers provide theoretical foundations for statistical learning theory. This talk will focus on quantitative uniform laws of large numbers for random matrices. Illustrations will be drawn from high-dimensional geometry and data science.
In this talk I will illustrate my ideas and plans for the development of an undergraduate curriculum in the broader area of data science that includes, among other things, a course in image processing. I will give an overview of the field, discuss typical problems studied within the discipline, and present an array of applications in medicine, astronomy, atmospheric science, security, navigation systems, and other areas; this will include a brief exposition of my own research on the recovery of images from videos affected by optical turbulence. I will draw on my experience teaching courses and doing research with undergraduates at different academic institutions.
I will talk about the use of peers to enhance learning in three different contexts. The first context is a flipped integral calculus course. Students are expected to prepare for class ahead of time by watching video(s) and taking online quizzes. The instructor accesses the quiz data before class and uses student responses to tailor the classroom instruction. In-class time focuses on extending student understanding with a variety of active learning techniques, including peer-to-peer instruction. I will report the data we have collected about the impact of this experience on both student attitudes and learning.
The second context is a summer online bridge program for incoming students. We utilize undergraduate coach/mentors who meet virtually with a team of 4-5 incoming students throughout the summer to help close some of their mathematical gaps. I will describe the design of this program, how it supports Yale's efforts to recruit and retain a diverse student body, and the impact it has on student attitudes and learning. I will also highlight data on the impact of peer coaches on both learning and the motivation to learn.
The third context is a systematic supervised reading/research program for ~1200 math majors at UC Irvine. I will provide some suggestions for how this program might be structured to leverage advanced undergraduates and graduate students to help motivated math majors.
For three years I served as the Director of Jump Labs, a new endeavor for cutting-edge research and recruiting launched by Jump Trading, a quantitative high-frequency trading firm based in Chicago.
Jump Labs sponsors research in high-performance computing and data science via gift grants involving:
Mentors from Jump Trading and Jump Venture Capital portfolio companies who guide the research along with University of Illinois professors
Jump Trading proprietary data: ~50 PB of historical market microstructure data from 60 exchanges around the world
Supercomputer grid resources
Office space at Jump Labs in the University of Illinois Research Park
The crux is to create a long-term, powerful pipeline for talent acquisition by challenging faculty and students with real-world problems. The structure aligns relevant industrial research with the passions and expertise of the faculty members and students, and opportunities for publication are encouraged. In our first two years we sponsored over 60 undergraduate and graduate students and 20 professors spanning 25 projects. In this way the program advances relevant research while creating a recruiting pipeline for talent that is long term and low risk.
We will discuss the successes and challenges encountered at Jump Labs in its first three years.
Undergraduates are curious about research in mathematics: what kinds of questions mathematicians ask, what research entails, and how one begins to solve a new problem. In this talk, we will discuss integrating undergraduate research projects into the classroom and how to expose students to new mathematical questions in both upper- and lower-division courses. We will then talk more generally about setting students up for success in the classroom.
Proximal algorithms offer state-of-the-art performance for many large-scale optimization problems. In recent years, the proximal algorithms landscape has simplified, making the subject quite accessible to undergraduate students. Students are empowered to achieve impressive results in areas such as image and signal processing, medical imaging, and machine learning using just a page or two of Python code. In this talk I'll discuss my experiences teaching proximal algorithms to students in the Physics and Biology in Medicine program at UCLA. I'll also share some of my teaching philosophy and approaches to teaching undergraduate math courses. Finally, I'll discuss my own research in optimization algorithms for radiation treatment planning, which is a fruitful source of undergraduate research projects.
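As an illustration of the kind of implementation that fits on a page, here is a minimal sketch of the proximal gradient method (ISTA) applied to the LASSO problem, minimize $\tfrac12\|Ax-b\|_2^2 + \lambda\|x\|_1$; the synthetic problem instance, step size, and regularization parameter are illustrative assumptions, not material from the course.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=1000):
    # Proximal gradient descent (ISTA) for the LASSO objective.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of (1/2)||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny synthetic sparse-recovery example (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.linalg.norm(x_hat - x_true))      # x_hat should approximate the sparse x_true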
Mathematical phase retrieval is the problem of solving systems of rank-1 quadratic equations. Over the last few years, there has been much interest in constructing algorithms with provable guarantees. Both theoretically and empirically, the most successful approaches have involved direct optimization of non-convex loss functions. In the first half of this talk, we will discuss how stochastic gradient descent for one of these loss functions provably results in (rapid) linear convergence with high probability. In the second half of the talk, we will discuss a semidefinite programming algorithm that simultaneously makes use of a sparsity prior on the solution vector, while overcoming possible model misspecification.
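For concreteness, the sketch below runs plain gradient descent on the intensity-based loss $f(x) = \frac{1}{4m}\sum_i \big((a_i^\top x)^2 - y_i\big)^2$ with a spectral initialization; the particular loss, initialization, and step size are illustrative assumptions and not necessarily those analyzed in the talk.

import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 400
A = rng.standard_normal((m, n))            # rows are the measurement vectors a_i
x_true = rng.standard_normal(n)
y = (A @ x_true) ** 2                      # phaseless (intensity) measurements

# Spectral initialization: leading eigenvector of (1/m) * sum_i y_i a_i a_i^T,
# rescaled to match the measurement scale.
Y = (A * y[:, None]).T @ A / m
_, V = np.linalg.eigh(Y)
x = V[:, -1] * np.sqrt(y.mean())

step = 0.1 / y.mean()                      # conservative step size
for _ in range(500):
    r = (A @ x) ** 2 - y                   # residuals of the rank-1 quadratic system
    grad = A.T @ (r * (A @ x)) / m         # gradient of the intensity-based loss
    x -= step * grad

# The solution is identifiable only up to a global sign.
print(min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true)))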
Given a planar infinity harmonic function $u$, for each $\alpha>0$ we show a quantitative $W^{1,\,2}_{\mathrm{loc}}$-estimate of $|Du|^{\alpha}$, which is sharp when $\alpha\to 0$. As a consequence we obtain an $L^p$-Liouville property for infinity harmonic functions in the whole plane.