High-dimensional partial differential equations (PDEs) are challenging to compute with traditional mesh-based methods, especially when their solutions have large gradients or concentrations at unknown locations. Mesh-free methods are more appealing; however, they remain slow and expensive when long-time, well-resolved computations are necessary. In this talk, we present DeepParticle, an integrated deep learning (DL), optimal transport (OT), and interacting particle (IP) approach, through a case study of Fisher-Kolmogorov-Petrovsky-Piskunov front speeds in incompressible flows. PDE analysis reduces the problem to the computation of the principal eigenvalue of an advection-diffusion operator. A stochastic representation via the Feynman-Kac formula makes possible a genetic interacting particle algorithm that evolves the particle distribution to a large-time invariant measure, from which the front speed is extracted. The invariant measure is parameterized by a physical parameter (the Peclet number). We learn this family of invariant measures by training a physically parameterized deep neural network on affordable data from IP computations at moderate Peclet numbers, and then predict at larger Peclet numbers, where IP computation is expensive. Our methodology extends to the more general context of deep learning of stochastic particle dynamics. For instance, we can learn and generate aggregation patterns in Keller-Segel chemotaxis systems.
The University of Hong Kong
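
The following is a minimal illustrative sketch, not the authors' implementation: it trains a small network T(x; Pe) that pushes reference samples toward IP-generated samples of the invariant measure at several moderate Peclet numbers, using an entropic (Sinkhorn) OT loss as a stand-in for the Wasserstein-2 objective described in the abstract. All names, network sizes, and the toy "IP data" generator are assumptions for illustration only.

```python
# Hedged sketch of a DeepParticle-style training loop (PyTorch).
# Assumptions: Sinkhorn entropic OT loss in place of the exact OT objective;
# ip_samples() is a placeholder for real interacting-particle (IP) data.
import torch
import torch.nn as nn

def sinkhorn_loss(x, y, eps=0.05, iters=50):
    """Entropic OT cost between equal-size point clouds x, y with uniform weights."""
    C = torch.cdist(x, y) ** 2                      # pairwise squared distances
    K = torch.exp(-C / eps)
    u = torch.full((x.shape[0],), 1.0 / x.shape[0])
    v = torch.full((y.shape[0],), 1.0 / y.shape[0])
    a, b = torch.ones_like(u), torch.ones_like(v)
    for _ in range(iters):                          # Sinkhorn fixed-point updates
        a = u / (K @ b)
        b = v / (K.t() @ a)
    P = torch.diag(a) @ K @ torch.diag(b)           # approximate transport plan
    return torch.sum(P * C)

class ParticleMap(nn.Module):
    """Network x -> T(x; Pe); the Peclet number enters as an extra input feature."""
    def __init__(self, dim=2, width=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, width), nn.Tanh(),
                                 nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, dim))
    def forward(self, x, pe):
        pe_col = torch.full((x.shape[0], 1), float(pe))
        return self.net(torch.cat([x, pe_col], dim=1))

def ip_samples(pe, n=256, dim=2):
    # Toy stand-in for IP output: samples of the invariant measure at Peclet number pe.
    return torch.randn(n, dim) / (1.0 + 0.1 * pe)

model = ParticleMap()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
train_peclets = [1.0, 2.0, 4.0, 8.0]                # moderate Peclet numbers (affordable IP data)
for step in range(200):
    pe = train_peclets[step % len(train_peclets)]
    x_ref = torch.randn(256, 2)                     # reference samples to be pushed forward
    loss = sinkhorn_loss(model(x_ref, pe), ip_samples(pe))
    opt.zero_grad(); loss.backward(); opt.step()

# Prediction at a larger Peclet number, where direct IP computation is expensive.
predicted_particles = model(torch.randn(256, 2), 16.0)
```

In this sketch the trained map is simply evaluated at a Peclet number outside the training set; in the talk's setting the front speed would then be extracted from the predicted particle distribution.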