Physics-Informed Neural Networks for parameter estimation in SDEs

Work with Sophie Donnet, Hugo Gangloff, Nicolas Jouvin
Keywords: pinns, sde, generative models, spatial ecology

Author: Lucia Clarotto

Physics-Informed Neural Networks for SDEs

Stochastic Differential Equations (SDEs) are popular models in many fields, including spatial ecology (Michelot et al. 2019), climate science (Ditlevsen and Ditlevsen 2023) and biology (Degond, Herda, and Mirrahimi 2020). Diffusion SDEs with additive noise are commonly encountered and are defined, as in Øksendal (2003), by \[\mathrm{d}X_t=F(X_t;\beta)\, \mathrm{d}t+\Sigma\, \mathrm{d}W_t, \quad X_0\sim p_0\] where \((X_t)_{t\geq 0}\) is the \(\mathbb{R}^d\)-valued stochastic process of interest, \(W_t\) is a \(d\)-dimensional Brownian motion, \(p_0\) is the initial distribution, \(\beta\) parametrizes the drift of the equation and \(\Sigma\) is the diffusion coefficient.
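For intuition, the sketch below simulates a trajectory of such an SDE with a simple Euler-Maruyama scheme (in JAX). The linear drift \(F(x;\beta)=-\beta x\), the numerical constants and all function names are illustrative choices for this sketch, not part of the project description.

```python
# Minimal Euler-Maruyama sketch for dX_t = F(X_t; beta) dt + Sigma dW_t.
# The linear drift and the numerical values below are placeholders for illustration.
import jax
import jax.numpy as jnp

def simulate_sde(key, x0, beta, sigma, dt=1e-2, n_steps=1000):
    """Simulate one trajectory of an additive-noise diffusion with Euler-Maruyama."""
    def drift(x):
        return -beta * x  # placeholder drift F(x; beta)

    def step(x, noise):
        x_next = x + drift(x) * dt + sigma * jnp.sqrt(dt) * noise
        return x_next, x_next

    noises = jax.random.normal(key, (n_steps,) + x0.shape)
    _, path = jax.lax.scan(step, x0, noises)
    return jnp.concatenate([x0[None], path])  # shape (n_steps + 1, d)

key = jax.random.PRNGKey(0)
x0 = jnp.array([1.0, -0.5])                   # d = 2; drawn from p_0 in practice
trajectory = simulate_sde(key, x0, beta=0.8, sigma=0.3)
```

Discrete observations \((x_{t_1},\dots,x_{t_n})\), as used below, can be thought of as a subsample of such a trajectory at the observation times.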

When proposing such a model for trajectories observed at discrete times \((x_{t_1}, \dots, x_{t_n})\), the next step is to estimate the parameter \(\theta=\{\beta,\Sigma\}\) from these observations. This is a critical task, as it provides insight into the mechanics of the underlying process.

One classical approach to parameter estimation is maximum likelihood. In some rare cases, when the SDE is such that the likelihood of the observations can be computed explicitly as a function of the parameters \(\theta\), the estimation reduces to a classical optimisation task. In many cases, however, this is not possible, and a common remedy is to linearize the SDE locally. Many such approaches have been proposed over the last decades, all with strengths and weaknesses (Pilipovic, Samson, and Ditlevsen 2024).
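As a concrete illustration of such a rare case (a standard textbook example, not specific to this project), consider the one-dimensional Ornstein-Uhlenbeck process with drift \(F(x;\beta)=-\beta x\) and diffusion coefficient \(\sigma\). Its transition density is Gaussian, \[X_{t_{i+1}} \mid X_{t_i}=x \;\sim\; \mathcal{N}\!\left(x\,e^{-\beta\Delta_i},\; \frac{\sigma^2}{2\beta}\left(1-e^{-2\beta\Delta_i}\right)\right), \qquad \Delta_i=t_{i+1}-t_i,\] so that, by the Markov property, the log-likelihood of \((x_{t_1},\dots,x_{t_n})\) is a sum of Gaussian log-densities that can be maximised directly in \((\beta,\sigma)\).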

To the best of our knowledge, another feature of SDEs has been under-used in the estimation context. Indeed, let \(p(x,t;\theta)\) be the density function of \(X_t\) for a given set of parameters \(\theta\). The behavior of \(p(x,t;\theta)\) is described by the Fokker-Planck Equation (FPE) (Risken 1989), which is the Partial Differential Equation (PDE) defined by \[\begin{eqnarray*} \label{eq:fpe} \frac{\partial p(x,t;\theta)}{\partial t}&=&-\nabla\cdot(F(x;\beta)p(x,t;\theta))+\frac{1}{2}\nabla\cdot\left(\Sigma\Sigma^\top\nabla p(x,t;\theta)\right),\\ p(\cdot ,0;\theta)&=&p_0 \end{eqnarray*}\] where \(\nabla\) and \(\nabla \cdot\) denote the gradient and divergence operators. Thus, solving this PDE would provide an implicit expression of the marginal likelihood of each observation \(x_{t_i}\), which is a first step towards the maximum likelihood estimation of \(\theta\).
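To make the FPE concrete (again with the standard Ornstein-Uhlenbeck example, not part of the project description), the one-dimensional drift \(F(x;\beta)=-\beta x\) with diffusion coefficient \(\sigma\) leads to the stationary equation \(0=\partial_x\!\left(\beta x\,p\right)+\frac{\sigma^2}{2}\partial_x^2 p\), whose solution is a centred Gaussian, \[p_\infty(x)\;\propto\;\exp\!\left(-\frac{\beta x^2}{\sigma^2}\right),\qquad \text{i.e. } p_\infty=\mathcal{N}\!\left(0,\frac{\sigma^2}{2\beta}\right),\] which is exactly the kind of marginal density one would plug into a likelihood for the observations.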

In the past few years, the emergence of Physics-Informed Neural Networks (PINNs) (Raissi, Perdikaris, and Karniadakis 2019) has led to a fundamental rethinking of traditional approaches to solving partial differential equations. In short, the PINN approach seeks the best neural network \(u_\nu\) (\(\nu\) being the set of weights and biases) representing the solution of a PDE of the form \(\mathcal{N}_\theta[u]=0\), where \(\mathcal{N}_\theta\) is an arbitrary differential operator, by minimizing its residuals computed at randomly sampled collocation points; this is the so-called forward problem. This mesh-free approach has proven useful in a variety of contexts. It also extends to inverse problems, where one seeks to learn the differential operator’s parameters \(\theta\) given some observations of the solution \(p(x_i,t_j;\theta)\), thus offering a flexible way to incorporate available “data” in the training.
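As a rough sketch of this idea for the FPE above (our own illustration: the small network, the one-dimensional linear drift and all names are hypothetical), the residual \(\partial_t p + \nabla\cdot(Fp) - \frac{1}{2}\nabla\cdot(\Sigma\Sigma^\top\nabla p)\) can be formed with automatic differentiation and averaged over collocation points:

```python
# Sketch of a PINN residual for the 1D Fokker-Planck equation (JAX).
# The MLP, the drift F(x; beta) = -beta * x and all constants are illustrative choices.
import jax
import jax.numpy as jnp

def mlp(params, x, t):
    """Small fully connected network u_nu(x, t) approximating p(x, t)."""
    h = jnp.array([x, t])
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return jnp.exp(W @ h + b)[0]   # exponential output to keep the density positive

def fpe_residual(params, x, t, beta, sigma):
    """Residual  dp/dt + d/dx(F p) - (sigma^2 / 2) d2p/dx2  at one collocation point."""
    p = lambda x_, t_: mlp(params, x_, t_)
    dp_dt = jax.grad(p, argnums=1)(x, t)
    flux = lambda x_: -beta * x_ * p(x_, t)                  # F(x; beta) * p
    div_flux = jax.grad(flux)(x)
    d2p_dx2 = jax.grad(jax.grad(p, argnums=0), argnums=0)(x, t)
    return dp_dt + div_flux - 0.5 * sigma**2 * d2p_dx2

def pde_loss(params, xs, ts, beta, sigma):
    """Mean squared FPE residual over a batch of collocation points."""
    res = jax.vmap(lambda x, t: fpe_residual(params, x, t, beta, sigma))(xs, ts)
    return jnp.mean(res**2)

# Toy initialisation and evaluation of the loss on random collocation points.
sizes = [2, 32, 32, 1]
keys = jax.random.split(jax.random.PRNGKey(0), len(sizes) - 1)
params = [(0.1 * jax.random.normal(k, (m, n)), jnp.zeros(m))
          for k, (n, m) in zip(keys, zip(sizes[:-1], sizes[1:]))]
xs = jax.random.normal(jax.random.PRNGKey(1), (128,))
ts = jax.random.uniform(jax.random.PRNGKey(2), (128,))
print(pde_loss(params, xs, ts, beta=0.8, sigma=0.3))
```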

Two additional difficulties arise in the context of this internship:

  1. First, \(p(x,t;\theta)\) being a density function, the PINN is expected to learn a normalized probability density, hence one must ensure that, for any \(t\), \(\int_{\Omega}p(x,t;\theta )\mathrm{d}x=1\).
  2. Second, we do not observe the solution itself but realisations of the SDE at discrete time points, whose marginal distribution is the solution of the PDE; a sketch of how both constraints could enter the training loss is given after this list.
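To sketch how these two points could enter the training objective (all names, weights and the quadrature grid on a bounded domain \(\Omega=[-L,L]\) are illustrative, and `mlp` / `pde_loss` refer to the placeholder functions of the previous sketch), one can add a penalty on the total mass and a negative log-density term evaluated at the observed realisations:

```python
# Illustrative extra loss terms for the two difficulties above (JAX).
# `mlp` and `pde_loss` are the placeholder functions from the previous sketch;
# the quadrature grid and the (unit) loss weights are arbitrary choices.
import jax
import jax.numpy as jnp

def normalisation_loss(params, t_grid, x_grid):
    """Penalise deviations of  int_Omega p(x, t; theta) dx  from 1."""
    dx = x_grid[1] - x_grid[0]                       # uniform grid on Omega = [-L, L]
    def mass_at(t):
        p_vals = jax.vmap(lambda x: mlp(params, x, t))(x_grid)
        return jnp.sum(p_vals) * dx                  # rectangle-rule quadrature
    masses = jax.vmap(mass_at)(t_grid)
    return jnp.mean((masses - 1.0) ** 2)

def data_loss(params, x_obs, t_obs):
    """Negative mean log-density of the discrete SDE observations under the PINN."""
    log_p = jax.vmap(lambda x, t: jnp.log(mlp(params, x, t) + 1e-12))(x_obs, t_obs)
    return -jnp.mean(log_p)

def total_loss(params, beta, sigma, xs, ts, x_grid, t_grid, x_obs, t_obs):
    """PDE residual + normalisation penalty + data term, with unit weights here."""
    return (pde_loss(params, xs, ts, beta, sigma)
            + normalisation_loss(params, t_grid, x_grid)
            + data_loss(params, x_obs, t_obs))
```

For the inverse problem, \(\beta\) and \(\Sigma\) would simply be treated as additional trainable parameters alongside the network weights \(\nu\).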

A recent line of research uses PINNs for simulation or parameter estimation in SDEs via their FPE, as just described (Feng, Zeng, and Zhou 2021; Chen et al. 2020; Liu, Wu, and Zhang 2023). Building on these works, we explore the connection between SDEs, their FPE, and the Physics-Informed Neural Network (PINN) methodology.

M2 internship 2026

We have an open position for an M2 student for a 6-month internship starting in 2026. The internship aims at proposing an efficient neural network architecture and optimisation scheme to accurately solve an FPE (forward problem) and to perform parameter estimation (inverse problem) using observational data assumed to be generated by the corresponding SDE. All the details are available here.


Chen, Xiaoli, Liu Yang, Jinqiao Duan, and George Em Karniadakis. 2020. “Solving Inverse Stochastic Problems from Discrete Particle Observations Using the Fokker-Planck Equation and Physics-Informed Neural Networks.” https://arxiv.org/abs/2008.10653.
Degond, Pierre, Maxime Herda, and Sepideh Mirrahimi. 2020. “A Fokker-Planck Approach to the Study of Robustness in Gene Expression.” Mathematical Biosciences and Engineering 17 (6): 6459–86. https://doi.org/10.3934/mbe.2020338.
Ditlevsen, Peter, and Susanne Ditlevsen. 2023. “Warning of a Forthcoming Collapse of the Atlantic Meridional Overturning Circulation.” Nature Communications 14 (1): 1–12.
Feng, Xiaodong, Li Zeng, and Tao Zhou. 2021. “Solving Time Dependent Fokker-Planck Equations via Temporal Normalizing Flow.” arXiv Preprint arXiv:2112.14012.
Liu, Feng, Faguo Wu, and Xiao Zhang. 2023. “PINF: Continuous Normalizing Flows for Physics-Constrained Deep Learning.” https://arxiv.org/abs/2309.15139.
Michelot, Théo, Pierre Gloaguen, Paul G. Blackwell, and Marie-Pierre Etienne. 2019. “The Langevin Diffusion as a Continuous-Time Model of Animal Movement and Habitat Selection.” Methods in Ecology and Evolution 10 (11). https://doi.org/10.1111/2041-210X.13275.
Øksendal, Bernt. 2003. “Stochastic Differential Equations.” In Stochastic Differential Equations: An Introduction with Applications, 38–50. Springer.
Pilipovic, Predrag, Adeline Samson, and Susanne Ditlevsen. 2024. “Parameter Estimation in Nonlinear Multivariate Stochastic Differential Equations Based on Splitting Schemes.” The Annals of Statistics 52 (2): 842–67.
Raissi, Maziar, Paris Perdikaris, and George E Karniadakis. 2019. “Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics 378: 686–707.
Risken, Hannes. 1989. “Fokker-Planck Equation.” In The Fokker-Planck Equation: Methods of Solution and Applications, 63–95. Springer.