Variational inference in neural functional prior using normalizing flows: application to differential equation and operator learning problems
Author affiliation: Institute of Interdisciplinary Research for Mathematics and Applied Science, School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan 430074, China
Publication: Applied Mathematics and Mechanics (English Edition) (应用数学和力学(英文版))
Year/Volume/Issue: 2023, Vol. 44, No. 7
Pages: 1111-1124
Subject classification: 12 [Management] 1201 [Management Science and Engineering] 07 [Science] 081104 [Pattern Recognition and Intelligent Systems (Engineering)] 08 [Engineering] 070104 [Applied Mathematics (Science)] 0835 [Software Engineering] 0811 [Control Science and Engineering] 0701 [Mathematics (Science)] 0812 [Computer Science and Technology (Engineering)]
Funding: Project supported by the National Natural Science Foundation of China (No. 12201229)
Keywords: uncertainty quantification (UQ); physics-informed neural network (PINN)
Abstract: Physics-informed deep learning has recently emerged as an effective tool for leveraging both observational data and available physical laws. Physics-informed neural networks (PINNs) and deep operator networks (DeepONets) are two such models. The former encodes the physical laws via automatic differentiation, while the latter learns the hidden physics from data. Generally, the noisy and limited observational data as well as the over-parameterization in neural networks (NNs) result in uncertainty in predictions from deep learning models. In the paper "MENG, X., YANG, L., MAO, Z., FERRANDIS, J. D., and KARNIADAKIS, G. E. Learning functional priors and posteriors from data and physics. Journal of Computational Physics, 457, 111073 (2022)", a Bayesian framework based on generative adversarial networks (GANs) has been proposed as a unified model to quantify uncertainties in the predictions of both PINNs and DeepONets. Specifically, the approach proposed in that paper has two stages: (i) prior learning, and (ii) posterior estimation. In the first stage, the GANs are utilized to learn a functional prior either from a prescribed function distribution, e.g., the Gaussian process, or from historical data and available physics. In the second stage, the Hamiltonian Monte Carlo (HMC) method is utilized to estimate the posterior in the latent space of the GANs. However, the vanilla HMC does not support mini-batch training, which limits its applications in problems with big data. In the present work, we propose to use normalizing flow (NF) models in the context of variational inference (VI), which naturally enables mini-batch training, as an alternative to HMC for posterior estimation in the latent space of GANs. A series of numerical experiments, including a nonlinear differential equation problem and a 100-dimensional (100D) Darcy problem, are conducted to demonstrate the effectiveness of the proposed approach.
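The core idea of the second stage described above — fitting a flow-based variational posterior in a latent space by stochastic (mini-batch) gradient ascent on the ELBO — can be illustrated with a minimal sketch. This is not the paper's implementation: it uses a hypothetical toy target (a diagonal Gaussian standing in for the latent-space posterior of a trained GAN) and the simplest possible normalizing flow, a single affine layer z = mu + sigma * eps, whose log-determinant is sum(log sigma). All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 2
m = np.array([2.0, -1.0])   # toy target posterior mean (assumed)
s = np.array([0.5, 1.5])    # toy target posterior std  (assumed)

def dlogp_dz(z):
    """Gradient of the unnormalized target log-density (analytic for
    this toy Gaussian target); in practice this would come from the
    physics-informed likelihood evaluated through the GAN generator."""
    return -(z - m) / s ** 2

# Flow parameters, optimized by stochastic gradient ascent on the ELBO:
#   ELBO = E_eps[ log p(mu + sigma*eps) ] + sum(log sigma) + const
mu = np.zeros(dim)
log_sigma = np.zeros(dim)

lr, steps, batch = 0.05, 2000, 64
for _ in range(steps):
    eps = rng.standard_normal((batch, dim))   # mini-batch of base samples
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                      # reparameterized flow output
    g = dlogp_dz(z)
    grad_mu = g.mean(axis=0)
    # chain rule through sigma*eps, plus +1 from the log-det Jacobian term
    grad_log_sigma = (g * eps * sigma).mean(axis=0) + 1.0
    mu += lr * grad_mu
    log_sigma += lr * grad_log_sigma

print(mu, np.exp(log_sigma))  # should approach m and s, respectively
```

Because each ELBO gradient is a Monte Carlo average over a mini-batch of base samples, the same loop applies unchanged when the log-likelihood itself is evaluated on mini-batches of observational data, which is the property that motivates replacing HMC with NF-based VI here.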