https://github.com/atomicarchitects/equiformer
[ICLR 2023 Spotlight] Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
Science Score: 36.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 16.3%, to scientific vocabulary)
Repository
Basic Info
- Host: GitHub
- Owner: atomicarchitects
- License: mit
- Language: Python
- Default Branch: master
- Homepage: https://arxiv.org/abs/2206.11990
- Size: 3.95 MB
Statistics
- Stars: 233
- Watchers: 5
- Forks: 44
- Open Issues: 10
- Releases: 0
Metadata Files
README.md
Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
Paper | OpenReview | Poster | Slides
This repository contains the official PyTorch implementation of the work "Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs" (ICLR 2023 Spotlight).
Additionally, in subsequent work we found that BERT-style self-supervised learning can be generalized to 3D atomistic systems. We call this DeNS (Denoising Non-Equilibrium Structures), and it improves the performance of Equiformer on the MD17 dataset. We provide the implementation for training Equiformer with DeNS on MD17 below; please refer to the paper and the code for further details.
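To make the idea concrete, here is a minimal, hypothetical sketch of a DeNS-style denoising objective in PyTorch: Gaussian noise is added to atomic positions, and the network is trained to predict that noise with a per-atom vector head. The function and model interface below are illustrative assumptions, not this repository's actual API; see main_md17_dens.py for the real implementation.
```python
import torch

def dens_denoising_loss(model, z, pos, noise_std=0.05):
    # Corrupt atomic positions with isotropic Gaussian noise.
    noise = noise_std * torch.randn_like(pos)
    noisy_pos = pos + noise
    # Assumed interface: the model maps (atomic numbers, positions)
    # to one 3D vector per atom (a force-like, equivariant output).
    pred_noise = model(z, noisy_pos)
    # Train the per-atom vector head to recover the added noise.
    return torch.mean((pred_noise - noise) ** 2)
```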
Content
Environment Setup
Environment
See here for setting up the environment.
QM9
The QM9 dataset is downloaded automatically the first time training runs.
MD17
The MD17 dataset is downloaded automatically the first time training runs.
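For reference, PyTorch Geometric (one of the libraries this code builds on) provides an MD17 dataset class with the same auto-download behavior; this is only an illustration, as the repository's own loader may differ in details:
```python
from torch_geometric.datasets import MD17

# Downloads and caches the aspirin trajectory on first use.
dataset = MD17(root='datasets/md17', name='aspirin')
data = dataset[0]
print(data.z.shape, data.pos.shape, data.energy.shape, data.force.shape)
```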
OC20
The datasets for the different OC20 tasks can be downloaded by following the instructions in the OCP GitHub repository (previously at this link).
After downloading, place the datasets under datasets/oc20/ using ln -s. Take is2re as an example:
```bash
cd datasets
mkdir oc20
cd oc20
ln -s ~/ocp/data/is2re is2re
```
Training
QM9
We provide training scripts under scripts/train/qm9/equiformer. For example, we can train Equiformer for the task of alpha by running:
```bash
sh scripts/train/qm9/equiformer/target@1.sh
```
The QM9 dataset will be downloaded automatically the first time training runs.
The target numbers for the different regression tasks can be found here.
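To illustrate what the target number selects, here is a hedged sketch using PyTorch Geometric's QM9 class (the repository builds on PyG, though its own pipeline may differ). QM9 stores 19 regression labels per molecule, and column 1 is the isotropic polarizability alpha, matching target@1 above:
```python
from torch_geometric.datasets import QM9

dataset = QM9(root='datasets/qm9')  # downloaded automatically on first use
target = 1                          # column 1 = isotropic polarizability (alpha)
y = dataset[0].y                    # label tensor of shape [1, 19]
print(y[0, target])                 # alpha for the first molecule
```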
We also provide code for training Equiformer with linear messages and dot product attention:
- To train Equiformer with linear messages, replace --model-name 'graph_attention_transformer_nonlinear_l2' with --model-name 'graph_attention_transformer_l2' in the training scripts.
- The training scripts for Equiformer with linear messages and dot product attention can be found in scripts/train/qm9/dp_equiformer.
Training logs of Equiformer can be found here.
MD17
We provide training scripts under scripts/train/md17/equiformer. For example, we can train Equiformer for the molecule aspirin by running:
```bash
sh scripts/train/md17/equiformer/se_l2/target@aspirin.sh  # L_max = 2
sh scripts/train/md17/equiformer/se_l3/target@aspirin.sh  # L_max = 3
```
Training logs of Equiformer with $L_{max} = 2$ and $L_{max} = 3$ can be found here ($L_{max} = 2$) and here ($L_{max} = 3$). Note that the training code uses energies in kcal mol$^{-1}$ and forces in kcal mol$^{-1}$ Å$^{-1}$, whereas the paper reports them in meV and meV Å$^{-1}$.
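Since the logs use kcal mol$^{-1}$ while the paper reports meV, a single constant relates the two (1 kcal mol$^{-1}$ ≈ 43.364 meV); the helper below is ours, not part of the repository:
```python
KCAL_PER_MOL_TO_MEV = 43.364  # 1 kcal/mol ≈ 0.043364 eV = 43.364 meV

def kcal_per_mol_to_mev(value):
    """Convert an energy MAE (or a force MAE per Å) from kcal/mol to meV."""
    return value * KCAL_PER_MOL_TO_MEV

# Hypothetical example: 0.123 kcal/mol corresponds to about 5.33 meV.
print(kcal_per_mol_to_mev(0.123))
```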
We provide scripts for training Equiformer with DeNS (Denoising Non-Equilibrium Structures) under scripts/train/md17/equiformer_dens. For example, we train Equiformer with DeNS for the molecule aspirin by running:
```bash
sh scripts/train/md17/equiformer_dens/se_l2/target@aspirin.sh  # L_max = 2
sh scripts/train/md17/equiformer_dens/se_l3/target@aspirin.sh  # L_max = 3
```
Logs of training Equiformer with DeNS for $L_{max} = 2$ and $L_{max} = 3$ can be found here ($L_{max} = 2$) and here ($L_{max} = 3$). As above, the training code uses kcal mol$^{-1}$ and kcal mol$^{-1}$ Å$^{-1}$, whereas the paper reports meV and meV Å$^{-1}$.
OC20
We train Equiformer on IS2RE data only by running:
```bash
sh scripts/train/oc20/is2re/graph_attention_transformer/l1_256_nonlinear_split@all_g@2.sh
```
a. This requires 2 GPUs and results in an energy MAE of around 0.5088 eV on the ID sub-split of the validation set.
b. Pretrained weights and training logs can be found here.
We train Equiformer on IS2RE data with the IS2RS auxiliary task and Noisy Nodes data augmentation by running:
```bash
sh scripts/train/oc20/is2re/graph_attention_transformer/l1_256_blocks@18_nonlinear_aux_split@all_g@4.sh
```
a. This requires 4 GPUs and results in an energy MAE of around 0.4156 eV on the ID sub-split of the validation set.
b. Pretrained weights and training logs can be found here.
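As a rough, hypothetical sketch of the Noisy Nodes idea (not this repository's exact code, which lives in oc20/trainer): input positions are perturbed, and an auxiliary per-atom position loss is added on top of the primary energy loss. The two-headed model interface and loss weight below are assumptions:
```python
import torch
import torch.nn.functional as F

def noisy_nodes_loss(model, z, pos, relaxed_pos, relaxed_energy,
                     noise_std=0.3, aux_weight=1.0):
    # Perturb the input structure with Gaussian noise.
    noisy_pos = pos + noise_std * torch.randn_like(pos)
    # Assumed two-headed model: energy prediction plus per-atom positions.
    pred_energy, pred_pos = model(z, noisy_pos)
    energy_loss = F.l1_loss(pred_energy, relaxed_energy)  # primary IS2RE loss
    aux_loss = F.l1_loss(pred_pos, relaxed_pos)           # IS2RS-style auxiliary loss
    return energy_loss + aux_weight * aux_loss
```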
File Structure
We have different files and models for QM9, MD17 and OC20.
General
- nets includes the code of the different network architectures for QM9, MD17 and OC20.
- scripts includes scripts for training models on QM9, MD17 and OC20.
QM9
- main_qm9.py is the training code for the QM9 dataset.
MD17
- main_md17.py is the code for training and evaluation on the MD17 dataset.
- main_md17_dens.py extends main_md17.py so that we can train Equiformer with DeNS.
OC20
Compared to the QM9 and MD17 code, a few changes are made to support (a minimal sketch follows this list):
- Removing weight decay for certain parameters specified by no_weight_decay. One example is here.
- A cosine learning rate schedule.
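A minimal sketch of both in plain PyTorch, assuming a timm-style model.no_weight_decay() that returns the parameter names to exempt (the actual trainer lives in oc20/trainer):
```python
import torch

def build_optimizer(model, lr=5e-4, weight_decay=1e-3, epochs=20):
    # Split parameters into decay / no-decay groups, timm-style.
    skip = model.no_weight_decay() if hasattr(model, 'no_weight_decay') else set()
    decay, no_decay = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        if name in skip or param.ndim <= 1:  # biases and norm weights
            no_decay.append(param)
        else:
            decay.append(param)
    optimizer = torch.optim.AdamW(
        [{'params': decay, 'weight_decay': weight_decay},
         {'params': no_decay, 'weight_decay': 0.0}],
        lr=lr)
    # Cosine learning rate schedule over the training run.
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)
    return optimizer, scheduler
```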
- main_oc20.py is the code for training and evaluation.
- oc20/trainer contains the code for the energy trainers.
- oc20/configs contains the config files for IS2RE.
Acknowledgement
Our implementation is based on PyTorch, PyG, e3nn, timm, ocp, SEGNN and TorchMD-NET.
Citation
If you use our code or method in your work, please consider citing the following:
```bibtex
@inproceedings{liao2023equiformer,
    title={Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs},
    author={Yi-Lun Liao and Tess Smidt},
    booktitle={International Conference on Learning Representations},
    year={2023},
    url={https://openreview.net/forum?id=KwmPfARgOTD}
}
```
If DeNS is helpful to your work, please consider citing the following as well:
```bibtex
@article{DeNS,
    title={Generalizing Denoising to Non-Equilibrium Structures Improves Equivariant Force Fields},
    author={Yi-Lun Liao and Tess Smidt and Muhammed Shuaibi* and Abhishek Das*},
    journal={arXiv preprint arXiv:2403.09549},
    year={2024}
}
```
Please direct any questions to Yi-Lun Liao (ylliao@mit.edu).
Owner
- Name: The Atomic Architects
- Login: atomicarchitects
- Kind: organization
- Location: United States of America
- Website: https://atomicarchitects.github.io/
- Twitter: AtomArchitects
- Repositories: 2
- Profile: https://github.com/atomicarchitects
Research Group of Prof. Tess Smidt
GitHub Events
Total
- Issues event: 7
- Watch event: 43
- Issue comment event: 10
- Push event: 2
- Fork event: 7
Last Year
- Issues event: 7
- Watch event: 43
- Issue comment event: 10
- Push event: 2
- Fork event: 7