Speaker: Chris Rackauckas

Institution: MIT

Time: Friday, November 8, 2019 - 11:00am to 12:00pm

Location: NS II Room 4201

Scientific Machine Learning (SciML) is an emerging discipline that merges the mechanistic models of science and engineering with non-mechanistic machine learning models to solve problems which were previously intractable. Recent results have showcased how methods like Physics-Informed Neural Networks (PINNs) can be utilized as data-efficient learning methods, embedding the structure of physical laws or biological knowledge as a prior into learnable structures so that neural networks can predict phenomena even from small datasets. Additionally, deep learning embedded within backward stochastic differential equations has been shown to be an effective tool for solving high-dimensional partial differential equations, such as the Hamilton-Jacobi-Bellman equation in 1000 dimensions. In this talk we will introduce the audience to these methods and show how these diverse methods are all instantiations of a neural differential equation: a differential equation where all or part of the equation is described by a latent neural network. Problems such as optimal control and automated learning of differential equation models will be reduced to training problems on neural differential equations. Once this is realized, we will show how a computational tool, DiffEqFlux.jl, is being optimized to allow for efficient training of a wide variety of neural differential equations, explaining how the performance properties of these equations differ from more traditional uses of differential equations, along with some of the early results of optimizing for this domain. The audience will leave knowing how neural differential equations and DiffEqFlux.jl may be a vital part of next-generation scientific tooling.
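To make the abstraction concrete, here is a minimal sketch of a neural ODE in DiffEqFlux.jl, in the spirit of the package's documented examples: the right-hand side du/dt = NN(u) is a latent neural network, and because the ODE solve is differentiable, training reduces to ordinary gradient-based optimization. The network architecture, time span, initial condition, and target data below are hypothetical placeholders, and the exact API may vary across DiffEqFlux.jl versions.

```julia
using DiffEqFlux, OrdinaryDiffEq, Flux

# The latent neural network defining the dynamics du/dt = NN(u).
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

# Wrap it as a neural ODE on t in [0, 1.5], solved with the Tsit5 integrator.
n_ode = NeuralODE(dudt, (0.0f0, 1.5f0), Tsit5(), saveat = 0.1f0)

u0 = Float32[2.0, 0.0]          # hypothetical initial condition
target = zeros(Float32, 2, 16)  # hypothetical "observed" trajectory (2 states x 16 save points)

# Forward pass: solving the ODE from u0 yields the predicted trajectory,
# and the loss compares it against the observed data.
loss() = sum(abs2, Array(n_ode(u0)) .- target)

# Training is a standard optimization problem: gradients flow through the
# ODE solver into the network weights.
Flux.train!(loss, Flux.params(n_ode), Iterators.repeated((), 100), ADAM(0.05))
```

The same pattern carries over to the other methods in the talk: once part of a differential equation is a trainable network and the solver is differentiable, tasks like optimal control or model discovery become fitting problems of this form.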