Title: GPT-PINN: Generative Pre-Trained Physics-Informed Neural Networks toward non-intrusive Meta-learning of parametric PDEs
Speaker: Yanlai Chen, UMass Dartmouth
Time: July 7, 2:00-3:00 PM
Venue: Room 1418, Management Building
Abstract:
The Physics-Informed Neural Network (PINN) has proven itself a powerful tool for obtaining numerical solutions of nonlinear partial differential equations (PDEs) by leveraging the expressivity of deep neural networks and the computing power of modern heterogeneous hardware. However, its training is still time-consuming, especially in multi-query and real-time simulation settings, and its parameterization is often excessive.
In this talk, we present the recently proposed Generative Pre-Trained PINN (GPT-PINN), which mitigates both challenges in the setting of parametric PDEs. GPT-PINN represents a brand-new meta-learning paradigm for parametric systems. As a network of networks, its outer/meta-network is hyper-reduced, with only one hidden layer having a significantly reduced number of neurons. Moreover, the activation function at each hidden neuron is a (full) PINN pre-trained at a judiciously selected system configuration. The meta-network adaptively "learns" the parametric dependence of the system and "grows" this hidden layer one neuron at a time. In the end, by encompassing a very small number of networks trained at this set of adaptively selected parameter values, the meta-network is capable of generating surrogate solutions for the parametric system across the entire parameter domain accurately and efficiently.
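The architecture described in the abstract can be pictured as a very small outer network whose hidden "neurons" are themselves full, pre-trained PINNs. The sketch below illustrates that idea in PyTorch; the class name GPTPINNMetaNetwork, the pretrained_pinns argument, and the simple linear output layer are illustrative assumptions rather than the authors' implementation, and the physics-informed training of the output weights at a new parameter value (as well as the greedy selection of training parameters) is omitted.

```python
import torch

class GPTPINNMetaNetwork(torch.nn.Module):
    """Minimal sketch of a GPT-PINN-style meta-network: one hidden layer
    whose activation functions are PINNs pre-trained at a few adaptively
    selected parameter values (an assumption-laden illustration only)."""

    def __init__(self, pretrained_pinns):
        super().__init__()
        # Each hidden "neuron" is a frozen, pre-trained PINN.
        self.pinns = torch.nn.ModuleList(pretrained_pinns)
        for p in self.pinns.parameters():
            p.requires_grad = False
        # Only trainable parameters: one output weight per hidden neuron.
        n = len(pretrained_pinns)
        self.weights = torch.nn.Parameter(torch.full((n,), 1.0 / n))

    def forward(self, xt):
        # Hidden-layer outputs: solutions of the pre-trained PINNs at (x, t).
        phi = torch.stack([pinn(xt) for pinn in self.pinns], dim=-1)
        # Surrogate solution: linear combination whose weights would, in the
        # actual method, be trained against the PDE residual at a new parameter.
        return (phi * self.weights).sum(dim=-1)
```

In use, one would train the output weights with the same physics-informed loss as a full PINN, but over only a handful of parameters, which is the source of the reported efficiency.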
About the speaker:
Dr. Yanlai Chen is a tenured professor in the Department of Mathematics at the University of Massachusetts Dartmouth and director of its graduate program in Engineering and Applied Science. He received his bachelor's degree from our university in 2002, and in 2007 he received a Ph.D. in applied mathematics and an M.S. in computer science from the University of Minnesota, followed by three years of postdoctoral research at Brown University.
Professor Chen's research interests include the design and analysis of algorithms for large-scale, high-performance scientific computing. He has served as principal investigator on three U.S. National Science Foundation grants on fast algorithms. He also actively supports low-income students, including serving as PI on an NSF STEM education grant that combines education reform with support for low-income students. Professor Chen has supervised four Ph.D. students to graduation.