ppnn

PDE Preserved Neural Network

https://github.com/jx-wang-s-group/ppnn

Science Score: 75.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 3 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, nature.com, zenodo.org
  • Academic email domains
  • Institutional organization owner
    Organization jx-wang-s-group has institutional domain (sites.nd.edu)
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.3%) to scientific vocabulary
Last synced: 6 months ago

Repository

PDE Preserved Neural Network

Basic Info
  • Host: GitHub
  • Owner: jx-wang-s-group
  • License: MIT
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 38.4 MB
Statistics
  • Stars: 50
  • Watchers: 1
  • Forks: 13
  • Open Issues: 1
  • Releases: 0
Created over 3 years ago · Last pushed 10 months ago
Metadata Files
Readme License Citation

README.md

PPNN

PDE Preserved Neural Network

Published in Communications Physics: Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics | arXiv version

Abstract

Traditional data-driven deep learning models often struggle with high training costs, error accumulation, and poor generalizability in complex physical processes. Physics-informed deep learning (PiDL) addresses these challenges by incorporating physical principles into the model. Most PiDL approaches regularize training by embedding governing equations into the loss function, yet this depends heavily on extensive hyperparameter tuning to weigh each loss term. To this end, we propose to leverage physics prior knowledge by “baking” the discretized governing equations into the neural network architecture via the connection between the partial differential equations (PDE) operators and network structures, resulting in a PDE-preserved neural network (PPNN). This method, embedding discretized PDEs through convolutional residual networks in a multi-resolution setting, largely improves the generalizability and long-term prediction accuracy, outperforming conventional black-box models. The effectiveness and merit of the proposed methods have been demonstrated across various spatiotemporal dynamical systems governed by spatiotemporal PDEs, including reaction-diffusion, Burgers’, and Navier-Stokes equations.
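The abstract's central idea, exploiting the connection between PDE operators and network structures, can be illustrated with a toy example (this is a sketch, not code from this repository; the names `lap_kernel` and `conv2d_same` are illustrative): the 5-point finite-difference Laplacian stencil is exactly a fixed 3×3 convolution kernel, so a discretized PDE operator can be "baked" into a convolutional layer.

```python
import numpy as np

# 5-point finite-difference Laplacian stencil. In a PPNN-style model this
# stencil would become a fixed (non-trainable) convolution kernel inside
# a convolutional residual block.
lap_kernel = np.array([[0.,  1., 0.],
                       [1., -4., 1.],
                       [0.,  1., 0.]])

def conv2d_same(u, k):
    """Toy 'same'-size 2D cross-correlation with zero padding."""
    up = np.pad(u, 1)
    out = np.zeros_like(u)
    for i in range(u.shape[0]):
        for j in range(u.shape[1]):
            out[i, j] = np.sum(up[i:i + 3, j:j + 3] * k)
    return out

# On a smooth field, the stencil (divided by h^2) approximates the Laplacian.
n = 64
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(np.pi * X) * np.sin(np.pi * Y)

lap_fd = conv2d_same(u, lap_kernel) / h**2   # discrete Laplacian via convolution
lap_exact = -2.0 * np.pi**2 * u              # analytic Laplacian of the test field
interior_err = np.abs(lap_fd - lap_exact)[1:-1, 1:-1].max()
```

Because the kernel weights come from the discretization rather than from training, such a layer preserves the PDE operator exactly, and the trainable layers only have to learn a correction on top of it.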

structure

  • results
    • Navier-Stokes equation

      [figure omitted]

Code

  • requirements

    ```bash
    pytorch
    numpy
    matplotlib
    tensorboard
    deepxde  # required by DeepONet only
    ```

  • data generation

    To generate the training and testing sets, please refer to the code in src/operators.py.

    The reference data used in the figures shown in the paper can be downloaded at DOI.

  • training

    ```bash
    python src/train2D.py cases/CASE_NAME.yaml
    ```

  • src: PPNN source code

    • operators.py: numerical operators, used to generate the datasets; also serves as the PDE-preserving part of PPNN
    • rhs.py: defines the right-hand sides of various PDEs
    • train2D.py: the main training script for the reaction-diffusion and Burgers' cases. It requires a config file; example config files are listed in the cases folder
    • models.py: deep learning neural networks
  • cases: contains YAML files that list the configurations for the different cases.

  • Bv: contains source code for parameterizing different boundary conditions, as discussed in the first section of the supplementary information.

  • baselines: source code for the baseline methods, including FNO, PINN, and DeepONet

  • Problems

    If you find any bugs in the code or have trouble running PPNN, you are very welcome to create an issue in this repository.
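As a rough sketch of how a PDE-preserved rollout step fits together (illustrative only; `pde_rhs`, `ppnn_step`, and `correction_net` are hypothetical names, not this repository's API): the known discretized physics advances the state, and a learned network contributes only a residual correction.

```python
import numpy as np

def pde_rhs(u, nu=0.01, h=1.0 / 63):
    """Known discretized physics: diffusion via a periodic 5-point Laplacian."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / h**2
    return nu * lap

def ppnn_step(u, correction_net, dt=1e-4):
    """One rollout step: preserved-PDE update plus a learned residual correction."""
    return u + dt * pde_rhs(u) + dt * correction_net(u)

# Stand-in for a trained correction network: a zero correction.
zero_net = lambda u: np.zeros_like(u)

rng = np.random.default_rng(0)
u0 = rng.random((64, 64))
u1 = ppnn_step(u0, zero_net)  # advance one step using only the preserved physics
```

With a zero correction the step reduces to a plain explicit diffusion update; training would fit `correction_net` so that long rollouts track the reference data.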

Citation

If you find our work relevant to your research, please cite:

```
@article{liu2024multi,
  title={Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics},
  author={Liu, Xin-Yang and Zhu, Min and Lu, Lu and Sun, Hao and Wang, Jian-Xun},
  journal={Communications Physics},
  volume={7},
  number={1},
  pages={31},
  year={2024},
  publisher={Nature Publishing Group UK London}
}
```

Owner

  • Name: JWang Group
  • Login: jx-wang-s-group
  • Kind: organization
  • Email: jwang33@nd.edu
  • Location: United States of America

Advance knowledge at the interface of AI and computational physics (scientific machine learning, data assimilation, physics-informed AI, uncertainty quantification)

Citation (CITATION.bib)

@article{liu2024multi,
  title={Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics},
  author={Liu, Xin-Yang and Zhu, Min and Lu, Lu and Sun, Hao and Wang, Jian-Xun},
  journal={Communications Physics},
  volume={7},
  number={1},
  pages={31},
  year={2024},
  publisher={Nature Publishing Group UK London}
}

GitHub Events

Total
  • Issues event: 11
  • Watch event: 23
  • Issue comment event: 10
  • Push event: 2
  • Fork event: 3
Last Year
  • Issues event: 11
  • Watch event: 23
  • Issue comment event: 10
  • Push event: 2
  • Fork event: 3

Dependencies

  • environment.yml (conda)