Speaker: Jack Xin

Institution: UCI

Time: Tuesday, April 30, 2024 - 3:00pm to 4:00pm

Location: 440R

Multiscale time-dependent partial differential equations (PDEs) are challenging to compute by traditional mesh-based methods, especially when their solutions develop large gradients or concentrations at unknown locations. Particle methods, based on microscopic aspects of the PDEs, are mesh-free and self-adaptive, yet still expensive when long-time or well-resolved computations are necessary.

We present DeepParticle, an integrated deep learning, optimal transport (OT), and interacting particle (IP) approach, to speed up the generation and prediction of PDE dynamics through two case studies on transport in fluid flows with chaotic streamlines:

1) large-time front speeds of the Fisher-Kolmogorov-Petrovsky-Piskunov (FKPP) equation;

2) the Keller-Segel (KS) chemotaxis system modeling bacterial evolution in the presence of a chemical attractant (schematic forms of both models are recalled below).
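For concreteness, commonly used forms of the two models read as follows; the specific coefficients, reaction term, and advecting velocity field v considered in the talk are not specified here and may differ:

\[ \partial_t u + v(x)\cdot\nabla u = \kappa\,\Delta u + \tfrac{1}{\tau}\,u(1-u) \quad \text{(FKPP in an incompressible flow)}, \]

\[ \partial_t \rho = \nabla\cdot\big(\mu\,\nabla\rho - \chi\,\rho\,\nabla c\big), \qquad -\Delta c = \rho \quad \text{(parabolic-elliptic KS)}. \]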

Analysis of the FKPP equation reduces the problem to computing the principal eigenvalue of an advection-diffusion operator. A normalized Feynman-Kac representation makes possible a genetic IP algorithm that evolves an initially uniform particle distribution to a large-time invariant measure from which front speeds are extracted. The invariant measure is parameterized by a physical parameter (the Péclet number). We train a lightweight deep neural network with local and global skip connections to learn this family of invariant measures. The training data come from IP computations in three dimensions at a few sample Péclet numbers.
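To make the reduction to an eigenvalue problem explicit, one common statement (given here schematically; the sign conventions and the linearized reaction rate f'(0) are assumptions) is the variational formula for the front speed in a unit direction e,

\[ c^*(e) = \inf_{\lambda > 0} \frac{\mu(\lambda)}{\lambda}, \]

where \mu(\lambda) is the principal eigenvalue of the advection-diffusion operator

\[ \mathcal{A}_\lambda \varphi = \kappa\,\Delta\varphi + \big(v - 2\kappa\lambda e\big)\cdot\nabla\varphi + \big(\kappa\lambda^2 - \lambda\, v\cdot e + f'(0)\big)\varphi, \]

and \mu(\lambda) admits a normalized Feynman-Kac representation as the large-time exponential growth rate of a weighted diffusion process, which the genetic IP algorithm samples.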

The training objective being minimized is a discrete Wasserstein distance from OT theory. The trained network predicts a more concentrated invariant measure at a larger Péclet number and also serves as a warm start to accelerate IP computation. The KS system is formulated as a McKean-Vlasov equation (the macroscopic limit) of a stochastic IP system. The DeepParticle framework extends to this setting and learns to generate various finite-time bacterial aggregation patterns.
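As an illustration of the kind of objective mentioned above, the sketch below (with hypothetical variable names; the actual DeepParticle loss, weighting, and mini-batch scheme are not specified here) computes an exact discrete 2-Wasserstein distance between two equal-size, uniformly weighted particle ensembles by solving an assignment problem:

import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def discrete_w2(x, y):
    """Exact 2-Wasserstein distance between two uniform-weight point clouds
    x and y of shape (n, d); with equal masses the optimal transport plan is
    a permutation, so a linear assignment solver suffices."""
    cost = cdist(x, y, metric="sqeuclidean")   # pairwise squared distances
    row, col = linear_sum_assignment(cost)     # optimal particle matching
    return float(np.sqrt(cost[row, col].mean()))

# Hypothetical usage: compare network-generated particles to IP reference particles.
rng = np.random.default_rng(0)
generated = rng.normal(size=(256, 3))            # stand-in for network output (3D)
reference = rng.normal(loc=0.5, size=(256, 3))   # stand-in for IP invariant-measure samples
print(discrete_w2(generated, reference))

The exact assignment step scales poorly with ensemble size, so approximate or entropically regularized OT solvers are often used in practice for larger particle sets.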