12:00pm - Zoom: https://sites.google.com/view/paw-seminar - Probability and Analysis Webinar - Andreas Defant - (Carl von Ossietzky Universität Oldenburg) TBA |
4:00pm to 5:00pm - https://uci.zoom.us/j/98883924034 - Applied and Computational Mathematics - Jonathan Siegel - (The Pennsylvania State University) Approximation Theory and Metric Entropy of Neural Networks We consider the problem of approximating high-dimensional functions using shallow neural networks, and more generally by sparse linear combinations of elements of a dictionary. We begin by introducing natural spaces of functions which can be efficiently approximated in this way. Then, we derive the metric entropy of the unit balls in these spaces, which allows us to calculate optimal rates of approximation by shallow neural networks. Next, we show that higher approximation rates can be obtained by further restricting the function class under consideration. In particular, on a restrictive but natural space of functions, shallow networks with ReLU$^k$ activation functions achieve an approximation rate of $O(n^{-(k+1)})$ in every dimension. Finally, we discuss the connections between this surprising result and the finite element method. |
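For readers unfamiliar with the setting, a shallow network of width $n$ with the ReLU$^k$ activation takes the form sketched below; the parameter names $a_j$, $w_j$, $b_j$ and the norm are generic notation introduced here for illustration, not taken from the abstract:
\[
f_n(x) \;=\; \sum_{j=1}^{n} a_j\,\sigma_k(w_j \cdot x + b_j), \qquad \sigma_k(t) = \max(0,t)^k .
\]
The rate quoted in the abstract then says that, for $f$ in the restricted function class, the best such $f_n$ satisfies $\|f - f_n\| = O(n^{-(k+1)})$, with the rate independent of the input dimension.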
9:00am to 10:00am - Zoom - Inverse Problems - Victor Isakov - (Wichita State University) On increasing stability and minimal data in inverse problems |