https://github.com/google-deepmind/flows_for_atomic_solids

Science Score: 23.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: iop.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.5%) to scientific vocabulary
Last synced: 5 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: google-deepmind
  • License: apache-2.0
  • Language: Python
  • Default Branch: main
  • Size: 78.1 KB
Statistics
  • Stars: 44
  • Watchers: 6
  • Forks: 2
  • Open Issues: 0
  • Releases: 0
Archived
Created almost 4 years ago · Last pushed over 3 years ago
Metadata Files
  • Readme
  • Contributing
  • License

README.md

Flows for atomic solids

The code in this repository can be used to train normalizing flow models to generate samples of atomic solids, as described in our paper Normalizing flows for atomic solids. It also contains a Colab notebook that loads parameters of already trained models and samples from them, plotting observables similar to the figures in the paper.
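For orientation, a normalizing flow generates samples by pushing a simple base distribution through an invertible map and scores them with the change-of-variables formula. A minimal NumPy sketch of that idea, using a toy elementwise affine map (illustrative only, not the models in this repository):

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters of a toy invertible map x = a * z + b (elementwise affine).
a, b = 2.0, -1.0

def sample_flow(n, dim=3):
    """Sample from the flow: draw z ~ N(0, I), push through x = a*z + b."""
    z = rng.standard_normal((n, dim))
    x = a * z + b
    # Change of variables: log q(x) = log N(z; 0, I) - log|det J|,
    # where J = dx/dz = a * I, so log|det J| = dim * log|a|.
    log_prob_z = -0.5 * np.sum(z**2, axis=1) - 0.5 * dim * np.log(2 * np.pi)
    log_prob_x = log_prob_z - dim * np.log(abs(a))
    return x, log_prob_x

x, logq = sample_flow(5)
```

A trained flow chains many such invertible layers, with parameters learned so that the model density approximates the Boltzmann distribution of the solid.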

Installation and usage

Structure of the code

The code is organized in the following folders:

  • colab: contains a Colab notebook to explore the samples from pre-trained models.
  • experiments: configuration files for Lennard-Jones and monatomic water experiments, and the script to run training on them.
  • models: modules to build normalizing flow models.
  • systems: definitions of the Lennard-Jones and monatomic water potentials used to train the models.
  • tutorial: contains a Colab notebook to train a small model from scratch on the 8-particle monatomic water system, with some parts of the code left for the reader to implement.
  • utils: utilities for building lattices and computing observables from the model samples.
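As an illustration of the kind of helper the utils folder provides, a simple-cubic lattice of particles in a periodic box can be built along these lines (a hypothetical helper for illustration, not the repository's implementation):

```python
import numpy as np

def simple_cubic_lattice(cells_per_side, box_length):
    """Place one particle per cell of a simple-cubic lattice in a cubic box."""
    spacing = box_length / cells_per_side
    idx = np.arange(cells_per_side)
    # Cartesian product of cell indices -> (cells_per_side**3, 3) coordinates.
    grid = np.stack(np.meshgrid(idx, idx, idx, indexing="ij"), axis=-1)
    return grid.reshape(-1, 3) * spacing

positions = simple_cubic_lattice(2, box_length=1.0)  # 8 particles
```

Lattices like this serve as reference configurations for the crystalline phases that the flows are trained to sample.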

Training a model

Python 3.7 or later is required to install and run the code.

To train one of the normalizing flows described in the paper, first clone this repository into a folder of your choice:

```shell
git clone https://github.com/deepmind/flows_for_atomic_solids.git
```

Set up a Python virtual environment with the required dependencies by running the run.sh script. This will also test-run the training script to make sure the installation succeeded.

```shell
source ./flows_for_atomic_solids/run.sh
```

Then run the experiments/train.py script, selecting one of the pre-configured systems:

```shell
python -m flows_for_atomic_solids.experiments.train --system='lj_32'
```

Please note that a GPU is necessary to train the larger models.
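The 'lj_32' system above is a 32-particle Lennard-Jones solid; the potential has the standard form U(r) = 4ε[(σ/r)¹² − (σ/r)⁶]. A minimal NumPy sketch of the total energy under periodic boundary conditions (illustrative only; the systems folder defines the actual potentials, including cutoffs and shifts):

```python
import numpy as np

def lennard_jones_energy(positions, box_length, epsilon=1.0, sigma=1.0):
    """Total pairwise LJ energy with the minimum-image convention."""
    n = positions.shape[0]
    # Pairwise displacement vectors, wrapped to the nearest periodic image.
    delta = positions[:, None, :] - positions[None, :, :]
    delta -= box_length * np.round(delta / box_length)
    r2 = np.sum(delta**2, axis=-1)
    i, j = np.triu_indices(n, k=1)  # each pair counted once
    inv_r6 = (sigma**2 / r2[i, j]) ** 3
    return np.sum(4.0 * epsilon * (inv_r6**2 - inv_r6))

# Two particles at the LJ minimum distance r = 2**(1/6) * sigma: U = -epsilon.
pos = np.array([[0.0, 0.0, 0.0], [2.0 ** (1 / 6), 0.0, 0.0]])
energy = lennard_jones_energy(pos, box_length=10.0)
```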

Exploring a pre-trained model

Open In Colab

The Colab notebook colab/explore_trained_models.ipynb can be used to access parameters of a set of models that have been trained as described in the paper. The Colab will load the model and reproduce the energy, radial distribution and work figures, as well as compute a free-energy estimate. A Colab runtime with GPU or TPU accelerator is recommended.
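The radial distribution function the notebook plots can be estimated from model samples by histogramming pairwise distances and normalizing by the ideal-gas expectation in each spherical shell. A bare-bones NumPy sketch (illustrative, not the notebook's implementation):

```python
import numpy as np

def radial_distribution(positions, box_length, n_bins=50):
    """Estimate g(r) for one configuration in a cubic periodic box."""
    n = positions.shape[0]
    delta = positions[:, None, :] - positions[None, :, :]
    delta -= box_length * np.round(delta / box_length)  # minimum image
    dist = np.sqrt(np.sum(delta**2, axis=-1))
    i, j = np.triu_indices(n, k=1)
    r_max = box_length / 2.0
    counts, edges = np.histogram(dist[i, j], bins=n_bins, range=(0.0, r_max))
    r = 0.5 * (edges[:-1] + edges[1:])
    # Expected pair count per shell for an ideal gas at the same density.
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = (n * (n - 1) / 2) * shell_vol / box_length**3
    return r, counts / ideal

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 5.0, size=(64, 3))
r, g = radial_distribution(pos, box_length=5.0)
```

In practice one would average the histogram over many model samples before normalizing, as a single configuration gives a noisy estimate.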

Disclaimer

This is not an official Google product.

Owner

  • Name: Google DeepMind
  • Login: google-deepmind
  • Kind: organization

GitHub Events

Total
  • Watch event: 6
Last Year
  • Watch event: 6

Dependencies

requirements.in pypi
  • absl-py >=0.13.0
  • chex >=0.1.1
  • distrax >=0.1.1
  • dm-haiku >=0.0.6
  • dm-tree >=0.1.6
  • jax >=0.3.2
  • jaxlib >=0.3.2
  • ml_collections >=0.1.1
  • numpy >=1.21.5
  • optax >=0.1.0
  • tensorflow-probability >=0.15.0
requirements.txt pypi
  • absl-py ==1.3.0
  • chex ==0.1.5
  • cloudpickle ==2.2.0
  • contextlib2 ==21.6.0
  • decorator ==5.1.1
  • distrax ==0.1.2
  • dm-haiku ==0.0.8
  • dm-tree ==0.1.7
  • etils ==0.8.0
  • gast ==0.5.3
  • importlib-resources ==5.10.0
  • jax ==0.3.23
  • jaxlib ==0.3.22
  • jmp ==0.0.2
  • ml-collections ==0.1.1
  • numpy ==1.23.4
  • opt-einsum ==3.3.0
  • optax ==0.1.3
  • pyyaml ==6.0
  • scipy ==1.9.3
  • six ==1.16.0
  • tabulate ==0.9.0
  • tensorflow-probability ==0.18.0
  • toolz ==0.12.0
  • typing-extensions ==4.4.0
  • zipp ==3.10.0