calm-pde

Official PyTorch implementation of the CALM layers and CALM-PDE model.

https://github.com/jhagnberger/calm-pde

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.1%) to scientific vocabulary

Keywords

deep-learning machine-learning neural-pde-solver simulations-physics
Last synced: 6 months ago

Repository

Official PyTorch implementation of the CALM layers and CALM-PDE model.

Basic Info
  • Host: GitHub
  • Owner: jhagnberger
  • License: mit
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 471 KB
Statistics
  • Stars: 2
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Topics
deep-learning machine-learning neural-pde-solver simulations-physics
Created 10 months ago · Last pushed 9 months ago
Metadata Files
Readme License Citation

README.md

CALM-PDE Logo

CALM-PDE: Continuous and Adaptive Convolutions for Latent Space Modeling of Time-dependent PDEs

Jan Hagnberger, Daniel Musekamp, Mathias Niepert

This repository contains the official PyTorch implementation of the CALM-PDE model from the paper, "CALM-PDE: Continuous and Adaptive Convolutions for Latent Space Modeling of Time-dependent PDEs".

🛠️ Requirements

The CALM-PDE model requires and is tested with the following packages:

  • PyTorch 2.5.1
  • NumPy 2.1.3
  • Optimized Einsum (opt-einsum) 3.4.0
  • Einops 0.8.0

Please also see the environment.yml file, which contains all packages to run the provided examples.

🚀 Usage of CALM Layers

The file calm_pde/layer_examples.py contains several examples demonstrating the use of CALM layers. The example below shows how a CALM layer can be used to encode a 2D point cloud with 4096 points into a latent point cloud with 512 points and 16 channels.

```python
import torch
from models.layers import CALMEncoderLayer

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

encoder = CALMEncoderLayer(in_channels=2, out_channels=16, num_query_points=512,
                           receptive_field=0.1, softmax_temp=1.0, spatial_dim=2,
                           is_periodic=False, init_query_pos=None, dropout=0.0).to(device)

# Create a random point cloud in 2D with the same coordinates for each sample
features = torch.rand(4, 4096, 2, device=device)   # (b, n, c)
mesh = torch.rand(4096, 2, device=device)          # (n, d)
features_new, positions = encoder(features, mesh)  # (4, 512, 16), (512, 2)
```

🤖 Usage of CALM-PDE Models

The file calm_pde/model_examples.py contains several examples demonstrating the use of CALM-PDE models. The example below shows how the CALM-PDE model can be used for a 2D PDE such as Navier-Stokes sampled on a regular mesh.

```python
import torch
from models.dynamics_models import NeuralODE
from models.models_2d import CALM_PDE_Vorticity

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Using the CALM-PDE model for a 2D PDE such as Navier-Stokes. We assume that
# each sample has the same mesh and that the mesh is periodic.
processor = NeuralODE(128, 256, n_heads=8, is_periodic=True)
model = CALM_PDE_Vorticity(processor, in_channels=1, out_channels=1, dropout=0.0).to(device)

# Create random 2D data
u0 = torch.rand(4, 4096, 1, device=device)        # (b, n, c)
input_mesh = torch.rand(4096, 2, device=device)   # (n, d)
output_mesh = torch.rand(4096, 2, device=device)  # (n, d)
rollout_steps = 20
u = model.forward(u0, input_mesh, output_mesh, rollout_steps)  # (4, 20, 4096, 1)
```

💾 Datasets

We use the following datasets in our experiments.

You can either download the datasets directly from the resources linked above or use the provided scripts in 📂 datasets to download the data files.

🔬 Experiments

To reproduce the results reported in the paper, please follow the instructions below.

1. PDEBench 1D Burgers' Equation

  1. Download the dataset with sh datasets/download_pdebench_datasets.sh 1d_burgers {LOCAL_DIR}
  2. Set data_path: "{LOCAL_DIR}/pdebench_datasets/" in the configuration file calm_pde/config/1d_pdebench_burgers.yaml
  3. Move to the directory 📂 bash and run the experiment with sh pdebench_burgers.sh or submit it to SLURM with sbatch pdebench_burgers.sh

2. FNO 2D Vorticity Datasets

  1. Download the preprocessed datasets with sh datasets/download_fno_datasets.sh 1e-4 {LOCAL_DIR}
  2. Set data_path: "{LOCAL_DIR}/fno_vorticity_datasets/" in the file calm_pde/config/2d_fno_1e-4.yaml
  3. Move to the directory 📂 bash and run the experiment with sh fno_vorticity_1e-4.sh or submit it to SLURM with sbatch fno_vorticity_1e-4.sh

3. PDEBench 3D Compressible Navier-Stokes Equation

  1. Download the dataset with sh datasets/download_pdebench_datasets.sh 3d_navier_stokes {LOCAL_DIR}
  2. Set data_path: "{LOCAL_DIR}/pdebench_datasets/" in the file calm_pde/config/3d_pdebench_cns.yaml
  3. Move to the directory 📂 bash and run the experiment with sh pdebench_3d_cns.sh or submit it to SLURM with sbatch pdebench_3d_cns.sh

4. MeshGraphNets 2D Airfoil Dataset

  1. Download the preprocessed dataset with sh datasets/download_preprocessed_meshgraphnets_datasets.sh airfoil {LOCAL_DIR}
  2. Set data_path: "{LOCAL_DIR}/meshgraphnets_datasets/airfoil/" in the file calm_pde/config/2d_airfoil_time.yaml
  3. Move to the directory 📂 bash and run the experiment with sh airfoil.sh or submit it to SLURM with sbatch airfoil.sh

5. MeshGraphNets 2D Cylinder Dataset

  1. Download the preprocessed dataset with sh datasets/download_preprocessed_meshgraphnets_datasets.sh cylinder_flow {LOCAL_DIR}
  2. Set data_path: "{LOCAL_DIR}/meshgraphnets_datasets/cylinder_flow/" in the file calm_pde/config/2d_cylinder_time.yaml
  3. Move to the directory 📂 bash and run the experiment with sh cylinder.sh or submit it to SLURM with sbatch cylinder.sh

🏗️ CALM-PDE Architecture

The following illustration shows the architecture of the CALM-PDE model for solving 2D time-dependent PDEs (e.g., incompressible Navier-Stokes equations with a cylinder geometry). CALM-PDE consists of three key components:

  • Encoder: Encodes the arbitrarily discretized PDE solution into a fixed latent space
  • Processor: Computes the latent representation of future timesteps via latent time-stepping
  • Decoder: Decodes the latent representation back to the spatial domain for queried coordinates

CALM-PDE Architecture
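The three-stage flow above can be sketched as a minimal encode-process-decode loop. The sketch below is purely illustrative: plain linear layers, pooling, and interpolation stand in for the actual CALM convolutions, and every name in it is invented for this example rather than taken from the repository's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyEncodeProcessDecode(nn.Module):
    """Illustrative encode-process-decode pipeline (NOT the CALM-PDE API)."""

    def __init__(self, in_channels=1, latent_points=512, latent_channels=16):
        super().__init__()
        # Encoder: lift input channels, then reduce n points -> latent_points
        self.encode = nn.Linear(in_channels, latent_channels)
        self.pool = nn.AdaptiveAvgPool1d(latent_points)
        # Processor: steps the latent state forward in time (residual update)
        self.process = nn.Linear(latent_channels, latent_channels)
        # Decoder: maps latent features back to the physical channels
        self.decode = nn.Linear(latent_channels, in_channels)

    def forward(self, u0, rollout_steps):
        n = u0.shape[1]                                   # u0: (b, n, c)
        z = self.encode(u0)                               # (b, n, latent_c)
        z = self.pool(z.transpose(1, 2)).transpose(1, 2)  # (b, latent_points, latent_c)
        states = []
        for _ in range(rollout_steps):
            z = z + self.process(z)                       # latent time-stepping
            d = self.decode(z).transpose(1, 2)            # (b, c, latent_points)
            d = F.interpolate(d, size=n, mode='linear',
                              align_corners=False)        # back to n query points
            states.append(d.transpose(1, 2))              # (b, n, c)
        return torch.stack(states, dim=1)                 # (b, t, n, c)


model = TinyEncodeProcessDecode()
u = model(torch.rand(4, 4096, 1), rollout_steps=20)
print(u.shape)  # torch.Size([4, 20, 4096, 1])
```

The key property this mirrors is that time-stepping happens entirely on the small latent point cloud; only the encoder and decoder touch the full (and, in CALM-PDE, arbitrarily discretized) mesh.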

🧳 Directory Tour

Below is a listing of the directory structure of the CALM-PDE repository.

  • 📂 calm_pde: Contains the code for the CALM-PDE model and the experiments.
  • 📂 bash: Contains bash scripts to run the experiments.
  • 📂 datasets: Contains bash scripts to download the used datasets.

🌟 Acknowledgements

We would like to thank the authors of PDEBench, FNO, and MeshGraphNets for their datasets, which we have used in our experiments. Furthermore, we would like to thank the authors of PDEBench for their benchmark framework, which served as inspiration for our codebase.

⚖️ License

MIT licensed, except where otherwise stated. Please see the LICENSE file.

✏️ Citation

If you find our project useful, please consider citing it.

```bibtex
@misc{calm-pde-hagnberger:2025,
      title={{CALM-PDE}: Continuous and {A}daptive {C}onvolutions for {L}atent {S}pace {M}odeling of {T}ime-dependent {PDE}s},
      author={Jan Hagnberger and Daniel Musekamp and Mathias Niepert},
      year={2025},
      eprint={2505.12944},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2505.12944},
}
```

Owner

  • Name: Jan Hagnberger
  • Login: jhagnberger
  • Kind: user
  • Location: Stuttgart, Germany
  • Company: University of Stuttgart

Machine Learning for Science

Citation (CITATION.bib)

@misc{calm-pde-hagnberger:2025,
      title={{CALM-PDE}: Continuous and {A}daptive {C}onvolutions for {L}atent {S}pace {M}odeling of {T}ime-dependent {PDE}s}, 
      author={Jan Hagnberger and Daniel Musekamp and Mathias Niepert},
      year={2025},
      eprint={2505.12944},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2505.12944}, 
}

GitHub Events

Total
  • Watch event: 8
  • Push event: 3
  • Public event: 1
Last Year
  • Watch event: 8
  • Push event: 3
  • Public event: 1

Dependencies

environment.yml pypi
  • antlr4-python3-runtime ==4.9.3
  • bokeh ==3.6.1
  • click ==8.1.7
  • contourpy ==1.3.1
  • cycler ==0.12.1
  • decorator ==4.4.2
  • docker-pycreds ==0.4.0
  • einops ==0.8.0
  • fonttools ==4.55.0
  • fsspec ==2024.10.0
  • gitdb ==4.0.11
  • gitpython ==3.1.43
  • h5py ==3.12.1
  • hydra-core ==1.3.2
  • imageio ==2.36.0
  • imageio-ffmpeg ==0.5.1
  • joblib ==1.4.2
  • kiwisolver ==1.4.7
  • matplotlib ==3.9.2
  • moviepy ==1.0.3
  • omegaconf ==2.3.0
  • opt-einsum ==3.4.0
  • packaging ==24.2
  • pandas ==2.2.3
  • pillow ==10.4.0
  • platformdirs ==4.3.6
  • plotly ==5.24.1
  • proglog ==0.1.10
  • protobuf ==5.28.3
  • psutil ==6.1.0
  • pyparsing ==3.2.0
  • python-dateutil ==2.9.0.post0
  • python-dotenv ==1.0.1
  • pytz ==2024.2
  • rdkit ==2024.3.6
  • scikit-learn ==1.6.1
  • scipy ==1.14.1
  • seaborn ==0.13.2
  • sentry-sdk ==2.19.0
  • setproctitle ==1.3.4
  • six ==1.16.0
  • smmap ==5.0.1
  • soundfile ==0.12.1
  • sympy ==1.13.1
  • tenacity ==9.0.0
  • threadpoolctl ==3.5.0
  • torchdiffeq ==0.2.5
  • tornado ==6.4.2
  • tqdm ==4.67.1
  • tzdata ==2024.2
  • wandb ==0.18.7
  • xyzservices ==2024.9.0