In this talk, we present simple high-order weighted essentially non-oscillatory (WENO) schemes for solving hyperbolic conservation laws. The main advantages of the schemes presented in the paper are their compactness, their robustness, and their good convergence properties for steady-state problems. Compared with the classical WENO schemes of G.-S. Jiang and C.-W. Shu [J. Comput. Phys., 126 (1996), 202-228], the new schemes have two major advantages. First, the associated optimal linear weights are independent of the topological structure of the mesh and can be any positive numbers, the only requirement being that they sum to one. Second, the new schemes are more compact and efficient than the scheme of Jiang and Shu. Extensive numerical results are provided to illustrate the good performance of these new WENO schemes.
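To fix notation for readers less familiar with the WENO framework, the following is a generic sketch of how such schemes combine candidate reconstructions through linear and nonlinear weights (the standard Jiang-Shu-type structure, not the specific new construction of the talk):

```latex
% Generic WENO-type convex combination (sketch only):
% \gamma_k > 0 are linear weights with \sum_k \gamma_k = 1,
% \beta_k are smoothness indicators of the candidate polynomials p_k,
% \epsilon is a small parameter preventing division by zero.
\tilde{\omega}_k = \frac{\gamma_k}{(\epsilon + \beta_k)^2}, \qquad
\omega_k = \frac{\tilde{\omega}_k}{\sum_j \tilde{\omega}_j}, \qquad
u^{\mathrm{WENO}}(x) = \sum_k \omega_k\, p_k(x).
```

The claim of the abstract is that, for the new schemes, the linear weights \gamma_k may be chosen as any positive numbers summing to one, independently of the mesh.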
We consider a family of regression problems in a semi-supervised setting. Given real-valued labels on a small subset of the data, the task is to recover the function on the whole data set while taking advantage of the (geometric) structure provided by the large number of unlabeled data points. We consider a random geometric graph to represent the geometry of the data set. We study objective functions which reward the regularity of the estimator function and impose or reward agreement with the training data. In particular, we consider discrete p-Laplacian and fractional Laplacian regularizations.
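As a concrete example of an objective of the kind described (a sketch of the constrained p-Laplacian case, in notation not taken from the talk), one minimizes a graph p-Dirichlet energy subject to agreement with the labeled points:

```latex
% Graph p-Laplacian regularization (sketch): x_1,\dots,x_n are data points,
% w_{ij} are edge weights of the random geometric graph (positive for points
% within the connectivity radius), y_i are the given labels on the labeled set Z.
\min_{u:\{x_1,\dots,x_n\}\to\mathbb{R}} \;
\sum_{i,j=1}^{n} w_{ij}\, \big| u(x_i) - u(x_j) \big|^{p}
\qquad \text{subject to } u(x_i) = y_i \ \text{for } x_i \in Z.
```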
We investigate the asymptotic behavior in the limit where the number of unlabeled points increases while the number of training points remains fixed. We uncover a delicate interplay between the regularizing nature of the functionals considered and the nonlocality inherent to the graph constructions. We rigorously obtain almost optimal ranges for the scaling of the graph connectivity radius under which asymptotic consistency holds. The talk is based on joint works with Matthew Dunlop, Andrew Stuart, and Matthew Thorpe.
The general average distance problem, introduced by Buttazzo, Oudet, and Stepanov, asks for a good way to approximate a high-dimensional object, represented as a measure, by a one-dimensional object. We will discuss two variants of the problem: one where the one-dimensional object is a measure with connected one-dimensional support, and one where it is an embedded curve. We will present examples showing that, even if the data measure is smooth, the nonlocality of the functional can cause the minimizers to have corners. Nevertheless, the curvature of the minimizer can still be regarded as a measure. We will discuss a priori estimates on the total curvature and ways to obtain information on the topological complexity of the minimizers. We will furthermore discuss functionals that take the transport along the network into account and model the best ways to design transportation networks. (Based on joint works with Xin Yang Lu and Slav Kirov.)
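For orientation, a standard penalized form of the average-distance functional reads as follows (a sketch; the talk may work with a length-constrained rather than penalized formulation):

```latex
% Penalized average-distance functional (sketch): \mu is the data measure,
% d(x,\Sigma) the distance from x to the set \Sigma, \mathcal{H}^1 the
% one-dimensional Hausdorff measure (length), \lambda > 0 a length penalty.
% The minimization is over compact, connected sets \Sigma of finite length.
E_\lambda(\Sigma) = \int d(x, \Sigma)\, d\mu(x) + \lambda\, \mathcal{H}^1(\Sigma).
```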
Capital Fund Management (Paris) and UCLA Applied Mathematics
Time:
Tuesday, October 17, 2017 - 11:00am
Location:
RH 306
Modern financial portfolio construction uses mean-variance optimisation, which requires knowledge of a very large covariance matrix. Replacing the unknown covariance matrix by the sample covariance matrix (SCM) leads to disastrous out-of-sample results, which can be explained by properties of large SCMs understood since Marchenko and Pastur. A better estimate of the true covariance can be built by studying the eigenvectors of the SCM via the average matrix resolvent. This object can be computed using a matrix generalisation of Voiculescu's addition and multiplication of free matrices. The original result of Ledoit and Péché on the SCM can be generalised to estimate any rotationally invariant matrix corrupted by additive or multiplicative noise. Note that the level of rigour of the seminar will be that of statistical physics.
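As a simplified, concrete illustration of this style of SCM cleaning (plain eigenvalue clipping against the Marchenko-Pastur bulk edge, not the rotationally invariant estimator discussed in the talk), a sketch in Python might look like:

```python
import numpy as np

def clipped_covariance(returns):
    """Toy cleaning of a sample covariance matrix (SCM): eigenvalues inside
    the Marchenko-Pastur noise bulk are replaced by their average, while the
    SCM eigenvectors are kept.  `returns` is a T x N array of observations
    assumed to be standardized (zero mean, unit variance)."""
    T, N = returns.shape
    q = N / T                                   # aspect ratio N/T
    scm = returns.T @ returns / T               # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(scm)      # columns of eigvecs are eigenvectors
    lambda_plus = (1.0 + np.sqrt(q)) ** 2       # MP upper edge for pure noise
    noise = eigvals <= lambda_plus
    cleaned = eigvals.copy()
    if noise.any():
        cleaned[noise] = eigvals[noise].mean()  # flatten the noise bulk
    return (eigvecs * cleaned) @ eigvecs.T      # V diag(cleaned) V^T

# Example on synthetic pure-noise data:
# rng = np.random.default_rng(0)
# X = rng.standard_normal((500, 100))
# Sigma_clean = clipped_covariance(X)
```

The common feature shared with rotationally invariant estimators is that the SCM eigenvectors are kept unchanged and only the eigenvalues are modified.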
I will first give an overview of the classical holomorphic isometry problem between complex manifolds, in particular between bounded symmetric domains. When the source is the unit ball, the characterization of holomorphic isometries into bounded symmetric domains is in general not yet clear. With Shan Tai Chan, we recently characterized the holomorphic isometries from the Poincaré disc to the product of the unit disc with the unit ball, which provides new examples of holomorphic isometries from the Poincaré disc into irreducible bounded symmetric domains of rank at least 2.
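For readers new to the topic, the usual formulation of the problem (stated here in standard notation, not necessarily that of the talk) concerns holomorphic maps between bounded symmetric domains that are isometric up to a normalizing constant with respect to the Bergman metrics:

```latex
% Holomorphic isometry in the standard sense (sketch):
% F : \Omega_1 \to \Omega_2 is holomorphic and, for some constant \lambda > 0,
F^{*}\, ds^{2}_{\Omega_2} \;=\; \lambda\, ds^{2}_{\Omega_1},
% where ds^2_{\Omega_i} denotes the Bergman metric of \Omega_i.
```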
This talk will illustrate some topological properties of the space F_k(M) of ordered k-tuples of distinct points in a manifold M. For a fixed manifold M, as k increases, we might expect the topology of the configuration spaces F_k(M) to become increasingly complicated. Church and others showed, however, that when M is connected and open, there is a representation-theoretic sense in which these configuration spaces stabilize. In this talk I will explain these stability patterns, and describe a higher-order “secondary representation stability” phenomenon among the unstable homology classes. These results may be viewed as a representation-theoretic analogue of current work of Galatius–Kupers–Randal-Williams. The project is joint with Jeremy Miller.
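In symbols, the ordered configuration space referred to above is the standard one:

```latex
% Ordered configuration space of k points in a manifold M:
F_k(M) \;=\; \{ (x_1, \dots, x_k) \in M^{k} \;:\; x_i \neq x_j \ \text{for } i \neq j \}.
```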