https://github.com/microsoft/mattergen

Official implementation of MatterGen -- a generative model for inorganic materials design across the periodic table that can be fine-tuned to steer the generation towards a wide range of property constraints.


Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, nature.com
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.9%) to scientific vocabulary

Keywords

generative-ai materials-design materials-science

Keywords from Contributors

transformer agent vlms cryptocurrency large-language-model multi-agent document-parser sft anthropic langchain
Last synced: 7 months ago

Repository

Official implementation of MatterGen -- a generative model for inorganic materials design across the periodic table that can be fine-tuned to steer the generation towards a wide range of property constraints.

Basic Info
Statistics
  • Stars: 1,464
  • Watchers: 28
  • Forks: 256
  • Open Issues: 6
  • Releases: 4
Topics
generative-ai materials-design materials-science
Created over 1 year ago · Last pushed 8 months ago
Metadata Files
Readme Contributing License Code of conduct Security

README.md

MatterGen logo

[![DOI](https://img.shields.io/badge/DOI-10.1038%2Fs41586--025--08628--5-blue)](https://www.nature.com/articles/s41586-025-08628-5) [![arXiv](https://img.shields.io/badge/arXiv-2312.03687-blue.svg?logo=arxiv&logoColor=white.svg)](https://arxiv.org/abs/2312.03687) [![Requires Python 3.10+](https://img.shields.io/badge/Python-3.10+-blue.svg?logo=python&logoColor=white)](https://python.org/downloads)

MatterGen is a generative model for inorganic materials design across the periodic table that can be fine-tuned to steer the generation towards a wide range of property constraints.

Table of Contents

Installation

The easiest way to install prerequisites is via uv, a fast Python package and project manager.

The MatterGen environment can be installed via the following commands (assumes you are running Linux and have a CUDA GPU):

```bash
pip install uv
uv venv .venv --python 3.10
source .venv/bin/activate
uv pip install -e .
```

Note that our datasets and model checkpoints are provided inside this repo via Git Large File Storage (LFS). To find out whether LFS is installed on your machine, run

```bash
git lfs --version
```

If this prints a version like `git-lfs/3.0.2 (GitHub; linux amd64; go 1.18.1)`, you can skip the following step.

Install Git LFS

If Git LFS was not installed before you cloned this repo, you can install it via:

```bash
sudo apt install git-lfs
git lfs install
```

Apple Silicon

[!WARNING] Running MatterGen on Apple Silicon is experimental. Use at your own risk.
Further, you need to run `export PYTORCH_ENABLE_MPS_FALLBACK=1` before any training or generation run.

Get started with a pre-trained model

We provide checkpoints of an unconditional base version of MatterGen as well as fine-tuned models for these properties:

  • mattergen_base: unconditional base model trained on Alex-MP-20
  • mp_20_base: unconditional base model trained on MP-20
  • chemical_system: fine-tuned model conditioned on chemical system
  • space_group: fine-tuned model conditioned on space group
  • dft_mag_density: fine-tuned model conditioned on magnetic density from DFT
  • dft_band_gap: fine-tuned model conditioned on band gap from DFT
  • ml_bulk_modulus: fine-tuned model conditioned on bulk modulus from an ML predictor
  • dft_mag_density_hhi_score: fine-tuned model jointly conditioned on magnetic density from DFT and HHI score
  • chemical_system_energy_above_hull: fine-tuned model jointly conditioned on chemical system and energy above hull from DFT

The checkpoints are located at checkpoints/<model_name> and are also available on Hugging Face. By default, they are downloaded from Hugging Face when requested. You can also manually download them from Git LFS via:

```bash
git lfs pull -I checkpoints/<model_name> --exclude=""
```

[!NOTE] The checkpoints provided were re-trained using this repository, i.e., are not identical to the ones used in the paper. Hence, results may slightly deviate from those in the publication.

Generating materials

Unconditional generation

To sample from the pre-trained base model, run the following command.

```bash
export MODEL_NAME=mattergen_base
export RESULTS_PATH=results/  # Samples will be written to this directory

# generate batch_size * num_batches samples
mattergen-generate $RESULTS_PATH --pretrained-name=$MODEL_NAME --batch_size=16 --num_batches 1
```

This script will write the following files into $RESULTS_PATH:

  • generated_crystals_cif.zip: a ZIP file containing a single .cif file per generated structure.
  • generated_crystals.extxyz: a single file containing the individual generated structures as frames.
  • If --record-trajectories == True (default): generated_trajectories.zip: a ZIP file containing a .extxyz file per generated structure, which contains the full denoising trajectory for each individual structure.
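The CIF ZIP described above can be unpacked with the standard library. A minimal sketch (the helper name is hypothetical; the ZIP layout of one .cif per structure follows the output description above):

```python
import pathlib
import zipfile

def extract_cifs(zip_path: str, out_dir: str) -> list:
    """Extract every .cif member from a generation-output ZIP.

    Hypothetical helper: mirrors the README's description of
    generated_crystals_cif.zip (one .cif file per generated structure).
    """
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        # Keep only .cif members; ignore anything else in the archive.
        names = [n for n in zf.namelist() if n.endswith(".cif")]
        zf.extractall(out, members=names)
    return names
```

From there, the extracted .cif files can be loaded with any structure-parsing library (e.g., pymatgen, which is already a MatterGen dependency).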

[!TIP] For best efficiency, increase the batch size to the largest your GPU can sustain without running out of memory.

[!NOTE] To sample from a model you've trained yourself, replace --pretrained-name=$MODEL_NAME with --model_path=$MODEL_PATH, filling in your model's location for $MODEL_PATH.

Property-conditioned generation

With a fine-tuned model, you can generate materials conditioned on a target property. For example, to sample from the model trained on magnetic density, you can run the following command.

```bash
export MODEL_NAME=dft_mag_density
export RESULTS_PATH="results/$MODEL_NAME/"  # Samples will be written to this directory, e.g., results/dft_mag_density

# Generate conditional samples with a target magnetic density of 0.15
mattergen-generate $RESULTS_PATH --pretrained-name=$MODEL_NAME --batch_size=16 --properties_to_condition_on="{'dft_mag_density': 0.15}" --diffusion_guidance_factor=2.0
```

[!TIP] The argument --diffusion-guidance-factor corresponds to the $\gamma$ parameter in classifier-free diffusion guidance. Setting it to zero corresponds to unconditional generation, and increasing it further tends to produce samples which adhere more to the input property values, though at the expense of diversity and realism of samples.
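The qualitative effect of $\gamma$ can be sketched as a simple mixing of unconditional and conditional score estimates. This is an illustration of classifier-free guidance in general, not MatterGen's actual implementation, whose exact parameterization may differ:

```python
def guided_score(score_uncond, score_cond, gamma):
    """Classifier-free guidance mixing (sketch, assumed parameterization).

    gamma = 0 recovers the unconditional score; larger gamma extrapolates
    toward (and beyond) the conditional score, trading sample diversity
    for adherence to the target property.
    """
    return [u + gamma * (c - u) for u, c in zip(score_uncond, score_cond)]
```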

Multiple property-conditioned generation

You can also generate materials conditioned on more than one property. For instance, you can use the pre-trained model located at checkpoints/chemical_system_energy_above_hull to generate conditioned on chemical system and energy above the hull, or the model at checkpoints/dft_mag_density_hhi_score for joint conditioning on HHI score and magnetic density. Adapt the following command to your specific needs:

```bash
export MODEL_NAME=chemical_system_energy_above_hull
export RESULTS_PATH="results/$MODEL_NAME/"  # Samples will be written to this directory
mattergen-generate $RESULTS_PATH --pretrained-name=$MODEL_NAME --batch_size=16 --properties_to_condition_on="{'energy_above_hull': 0.05, 'chemical_system': 'Li-O'}" --diffusion_guidance_factor=2.0
```

Evaluation

Once you have generated a list of structures contained in $RESULTS_PATH (either using MatterGen or another method), you can relax the structures using the default MatterSim machine learning force field (see repository) and compute novelty, uniqueness, stability (using energy estimated by MatterSim), and other metrics via the following command:

```bash
git lfs pull -I data-release/alex-mp/reference_MP2020correction.gz --exclude=""  # first download the reference dataset from Git LFS
mattergen-evaluate --structures_path=$RESULTS_PATH --relax=True --structure_matcher='disordered' --save_as="$RESULTS_PATH/metrics.json"
```

This script will write metrics.json containing the metric results to $RESULTS_PATH and print them to your console.

[!IMPORTANT] The evaluation script in this repository uses MatterSim, a machine-learning force field (MLFF), to relax structures and assess their stability via MatterSim's predicted energies. While this approach is orders of magnitude faster than evaluation via density functional theory (DFT), does not require a license to run, and is typically quite accurate, there are important caveats: (1) in the MatterGen publication, we used DFT to evaluate structures generated by all models and baselines; (2) DFT is more accurate and reliable, particularly in less common chemical systems. Evaluation results obtained with this code may therefore differ from DFT results, and we recommend confirming results obtained with MLFFs via DFT before drawing conclusions.

[!TIP] By default, this uses MatterSim-v1-1M. If you would like to use the larger MatterSim-v1-5M model, you can add the --potential_load_path="MatterSim-v1.0.0-5M.pth" argument. You may also check the MatterSim repository for the latest version of the model.

If, instead, you have relaxed the structures and obtained the relaxed total energies via another means (e.g., DFT), you can evaluate the metrics via:

```bash
git lfs pull -I data-release/alex-mp/reference_MP2020correction.gz --exclude=""  # first download the reference dataset from Git LFS
mattergen-evaluate --structures_path=$RESULTS_PATH --energies_path='energies.npy' --relax=False --structure_matcher='disordered' --save_as='metrics'
```

This script will try to read structures from disk in the following precedence order:

  • If $RESULTS_PATH points to a .xyz or .extxyz file, it will read it directly and assume each frame is a different structure.
  • If $RESULTS_PATH points to a .zip file containing .cif files, it will first extract and then read the .cif files.
  • If $RESULTS_PATH points to a directory, it will read all .cif, .xyz, or .extxyz files in the order they occur in os.listdir.
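That precedence order can be sketched in plain Python. This is an illustration of the documented behaviour only, not the actual mattergen-evaluate code:

```python
import os
from pathlib import Path

def discover_structures(results_path: str) -> list:
    """Mirror the documented reading precedence for $RESULTS_PATH (sketch)."""
    p = Path(results_path)
    if p.suffix in (".xyz", ".extxyz"):
        return [str(p)]      # single multi-frame file, one structure per frame
    if p.suffix == ".zip":
        return [str(p)]      # ZIP of .cif files, extracted before reading
    if p.is_dir():
        # Directory: collect structure files in os.listdir order.
        keep = (".cif", ".xyz", ".extxyz")
        return [os.path.join(results_path, n)
                for n in os.listdir(results_path) if n.endswith(keep)]
    raise ValueError(f"Unsupported structures path: {results_path}")
```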

Here, we expect energies.npy to be a numpy array with the entries being float energies in the same order as the structures read from $RESULTS_PATH.
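Producing such a file is a one-liner with numpy; a minimal sketch with placeholder values:

```python
import numpy as np

# One float energy per structure, in the exact order the structures are
# read from $RESULTS_PATH (e.g., frame order of the .extxyz file).
# The values below are placeholders, not real energies.
energies = np.array([-3.41, -2.87, -5.02], dtype=np.float64)
np.save("energies.npy", energies)

loaded = np.load("energies.npy")
```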

If you want to save the relaxed structures, together with their energies, forces, and stresses, add --structures_output_path=YOUR_PATH to the script call, like so:

```bash
mattergen-evaluate --structures_path=$RESULTS_PATH --relax=True --structure_matcher='disordered' --save_as='metrics' --structures_output_path="relaxed_structures.extxyz"
```

Benchmark

In plot_benchmark_results.ipynb we provide a Jupyter notebook to generate figures like Figs. 2e and 2f in the paper. We further provide the metrics obtained by analyzing samples generated by several baselines under benchmark/metrics. You can add your own model's results by copying the metrics JSON file produced by mattergen-evaluate into the same folder. Note, again, that these results were obtained via MatterSim relaxation and energies, so they will differ from results obtained via DFT (e.g., those in the paper).

S.U.N. plot RMSD plot

For convenience, here are the numerical results from Figs. 2e and 2f in the paper (as well as Table D4 in the supplementary information):

| Model | % S.U.N. | RMSD | % Stable | % Unique | % Novel |
|-------|----------|------|----------|----------|---------|
| MatterGen | 38.57 | 0.021 | 74.41 | 100.0 | 61.96 |
| MatterGen MP20 | 22.27 | 0.110 | 42.19 | 100.0 | 75.44 |
| DiffCSP Alex-MP-20 | 33.27 | 0.104 | 63.33 | 99.90 | 66.94 |
| DiffCSP MP20 | 12.71 | 0.232 | 36.23 | 100.0 | 70.73 |
| CDVAE | 13.99 | 0.359 | 19.31 | 100.0 | 92.00 |
| FTCP | 0.0 | 1.492 | 0.0 | 100.0 | 100.0 |
| G-SchNet | 0.98 | 1.347 | 1.63 | 100.0 | 98.23 |
| P-G-SchNet | 1.29 | 1.360 | 3.11 | 100.0 | 88.40 |

Evaluate using your own reference dataset

[!IMPORTANT] If you are planning to use MatterSim to evaluate the stability of the generated structures, then the reference dataset you provide must contain energies that are compatible with MatterSim, meaning they should be either DFT-computed energies calculated according to the Materials Project compatibility scheme, or energies computed directly with MatterSim.

If you want to use your own custom dataset for evaluation, you first need to serialize and save it as follows:

```python
from mattergen.evaluation.reference.reference_dataset import ReferenceDataset
from mattergen.evaluation.reference.reference_dataset_serializer import LMDBGZSerializer

reference_dataset = ReferenceDataset.from_entries(name="my_reference_dataset", entries=entries)
LMDBGZSerializer().serialize(reference_dataset, "path_to_file.gz")
```

where entries is a list of pymatgen.entries.computed_entries.ComputedStructureEntry objects containing structure-energy pairs for each structure.

By default, we apply the MaterialsProject2020Compatibility energy correction scheme to all input structures during evaluation, and assume that the reference dataset has already been pre-processed using the same compatibility scheme. Therefore, unless you have already done this, you should obtain the entries object for your custom reference dataset in the following way:

```python
from mattergen.evaluation.utils.vasprunlike import VasprunLike
from pymatgen.entries.compatibility import MaterialsProject2020Compatibility

entries = []
for structure, energy in zip(structures, energies):
    vasprun_like = VasprunLike(structure=structure, energy=energy)
    entries.append(vasprun_like.get_computed_entry(
        inc_structure=True, energy_correction_scheme=MaterialsProject2020Compatibility()
    ))
```

Train MatterGen yourself

Before we can train MatterGen from scratch, we have to unpack and preprocess the dataset files.

Pre-process a dataset for training

You can run the following command for mp_20:

```bash
# Download file from LFS
git lfs pull -I data-release/mp-20/ --exclude=""
unzip data-release/mp-20/mp_20.zip -d datasets
csv-to-dataset --csv-folder datasets/mp_20/ --dataset-name mp_20 --cache-folder datasets/cache
```

You will get preprocessed data files in datasets/cache/mp_20.

To preprocess our larger alex_mp_20 dataset, run:

```bash
# Download file from LFS
git lfs pull -I data-release/alex-mp/alex_mp_20.zip --exclude=""
unzip data-release/alex-mp/alex_mp_20.zip -d datasets
csv-to-dataset --csv-folder datasets/alex_mp_20/ --dataset-name alex_mp_20 --cache-folder datasets/cache
```

This will take some time (~1 h). You will get preprocessed data files in datasets/cache/alex_mp_20.

Training

You can train the MatterGen base model on mp_20 using the following command.

```bash
mattergen-train data_module=mp_20 ~trainer.logger
```

[!NOTE] For Apple Silicon training, add ~trainer.strategy trainer.accelerator=mps to the above command.

The validation loss (loss_val) should reach 0.4 after 360 epochs (about 80k steps). The output checkpoints can be found at outputs/singlerun/${now:%Y-%m-%d}/${now:%H-%M-%S}. We call this folder $MODEL_PATH for future reference.

[!NOTE] We use hydra to configure our training and sampling jobs. The hierarchical configuration can be found under mattergen/conf. In the following we make use of hydra's config overrides to update these configs via the CLI. See the hydra documentation for an introduction to the config override syntax.

[!TIP] By default, we disable Weights & Biases (W&B) logging via the ~trainer.logger config override. You can enable it by removing this override. In mattergen/conf/trainer/default.yaml, you may enter your W&B logging info or specify your own logger.

To train the MatterGen base model on alex_mp_20, use the following command:

```bash
mattergen-train data_module=alex_mp_20 ~trainer.logger trainer.accumulate_grad_batches=4
```

[!NOTE] For Apple Silicon training, add ~trainer.strategy trainer.accelerator=mps to the above command.

[!TIP] Note that a single GPU's memory is usually not enough for a batch size of 512, hence we accumulate gradients over 4 batches. If you still run out of memory, increase this value further.

Crystal structure prediction

Although not a focus of our paper, you can also train MatterGen in crystal structure prediction (CSP) mode, where it does not denoise the atom types during generation. This gives you the ability to condition generation on a specific chemical formula. You can train MatterGen in this mode by passing --config-name=csp to run.py.

To sample from this model, pass --target_compositions=['{"<element1>": <number_of_element1_atoms>, "<element2>": <number_of_element2_atoms>, ..., "<elementN>": <number_of_elementN_atoms>}'] --sampling-config-name=csp to generate.py. An example composition could be --target_compositions=['{"Na": 1, "Cl": 1}'].
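Since each composition is a JSON string, it can be built programmatically instead of typed by hand. A small sketch (composition_arg is a hypothetical helper, not part of MatterGen):

```python
import json

def composition_arg(counts: dict) -> str:
    """Serialize an element -> atom-count mapping for --target_compositions."""
    return json.dumps(counts)

# e.g., one formula unit of rock salt:
arg = composition_arg({"Na": 1, "Cl": 1})
```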

Fine-tuning on property data

You can fine-tune the MatterGen base model using the following command.

```bash
export PROPERTY=dft_mag_density
mattergen-finetune adapter.pretrained_name=mattergen_base data_module=mp_20 +lightning_module/diffusion_module/model/property_embeddings@adapter.adapter.property_embeddings_adapt.$PROPERTY=$PROPERTY ~trainer.logger data_module.properties=["$PROPERTY"]
```

dft_mag_density denotes the target property for fine-tuning. You can also fine-tune a model you've trained yourself by replacing adapter.pretrained_name=mattergen_base with adapter.model_path=$MODEL_PATH, filling in your model's location for $MODEL_PATH.

[!NOTE] For Apple Silicon training, add ~trainer.strategy trainer.accelerator=mps to the above command.

[!TIP] You can select any property that is available in the dataset. See mattergen/conf/data_module/mp_20.yaml or mattergen/conf/data_module/alex_mp_20.yaml for the list of supported properties. You can also add your own custom property data. See below for instructions.

Multi-property fine-tuning

You can also fine-tune MatterGen on multiple properties. For instance, to fine-tune it on dft_mag_density and dft_band_gap, you can use the following command.

```bash
export PROPERTY1=dft_mag_density
export PROPERTY2=dft_band_gap
export MODEL_NAME=mattergen_base
mattergen-finetune adapter.pretrained_name=$MODEL_NAME data_module=mp_20 +lightning_module/diffusion_module/model/property_embeddings@adapter.adapter.property_embeddings_adapt.$PROPERTY1=$PROPERTY1 +lightning_module/diffusion_module/model/property_embeddings@adapter.adapter.property_embeddings_adapt.$PROPERTY2=$PROPERTY2 ~trainer.logger data_module.properties=["$PROPERTY1","$PROPERTY2"]
```

[!TIP] Add more properties analogously by adding these overrides:
1. +lightning_module/diffusion_module/model/property_embeddings@adapter.adapter.property_embeddings_adapt.<my_property>=<my_property>
2. Add <my_property> to the data_module.properties=["$PROPERTY1","$PROPERTY2",...,<my_property>] override.

[!NOTE] For Apple Silicon training, add ~trainer.strategy trainer.accelerator=mps to the above command.

Fine-tune on your own property data

You may also fine-tune MatterGen on your own property data. Essentially, all you need is a property value (typically a float) for a subset of the data you want to train on (e.g., alex_mp_20). Proceed as follows:

1. Add the name of your property to the PROPERTY_SOURCE_IDS list inside mattergen/common/utils/globals.py.
2. Add a new column with this name to the dataset(s) you want to train on, e.g., datasets/alex_mp_20/train.csv and datasets/alex_mp_20/val.csv (requires you to have followed the pre-processing steps).
3. Re-run the CSV-to-dataset script: csv-to-dataset --csv-folder datasets/<MY_DATASET>/ --dataset-name <MY_DATASET> --cache-folder datasets/cache, substituting your dataset name for MY_DATASET.
4. Add a <your_property>.yaml config file to mattergen/conf/lightning_module/diffusion_module/model/property_embeddings. If you are adding a float-valued property, you may copy an existing configuration, e.g., dft_mag_density.yaml. More complicated properties will require you to create your own custom PropertyEmbedding subclass; e.g., see the space_group or chemical_system configs.
5. Follow the fine-tuning instructions above and reference your own property the same way as existing properties like dft_mag_density.
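The column-adding step amounts to one extra field per row in the dataset CSVs. A stdlib sketch (the function name and the example column name are illustrative assumptions, not MatterGen utilities):

```python
import csv

def add_property_column(in_csv, out_csv, prop_name, values):
    """Append a property column (one value per row) to a dataset CSV."""
    with open(in_csv, newline="") as f:
        rows = list(csv.DictReader(f))
    assert len(values) == len(rows), "need one property value per structure"
    for row, value in zip(rows, values):
        row[prop_name] = value
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

Structures without a property value would need handling appropriate to your data (e.g., leaving the field empty), since the property typically covers only a subset of the training set.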

Data release

We provide datasets to train as well as evaluate MatterGen. For more details and license information see the respective README files under data-release.

Training datasets

  • MP-20 (Jain et al., 2013): contains 45k general inorganic materials, including most experimentally known materials with no more than 20 atoms in the unit cell.
  • Alex-MP-20: training dataset consisting of around 600k structures from MP-20 and Alexandria (Schmidt et al. 2022) with at most 20 atoms inside the unit cell and within 0.1 eV/atom of the convex hull. See the Venn diagram below and the MatterGen paper for more details.

Reference dataset

We further provide the Alex-MP reference dataset which can be used to evaluate novelty and stability of generated samples. The reference set contains 845,997 structures with their DFT energies. See the following Venn diagram for more details about the composition of the training and reference datasets.

[!NOTE] For license reasons, we cannot share the 4.4k ordered + 117.7k disordered ICSD structures, so results may differ from those in the paper.

Dataset Venn diagram

CIFs and experimental measurements

The data-release directory also contains the CIF files for all structures shown in the paper, as well as XPS, XRD, and nanoindentation measurements of the TaCr2O6 sample presented in the paper.

Citation

If you are using our code, model, data, or evaluation pipeline, please consider citing our work:

```bibtex
@article{MatterGen2025,
  author  = {Zeni, Claudio and Pinsler, Robert and Z{\"u}gner, Daniel and Fowler, Andrew and Horton, Matthew and Fu, Xiang and Wang, Zilong and Shysheya, Aliaksandra and Crabb{\'e}, Jonathan and Ueda, Shoko and Sordillo, Roberto and Sun, Lixin and Smith, Jake and Nguyen, Bichlien and Schulz, Hannes and Lewis, Sarah and Huang, Chin-Wei and Lu, Ziheng and Zhou, Yichi and Yang, Han and Hao, Hongxia and Li, Jielan and Yang, Chunlei and Li, Wenjie and Tomioka, Ryota and Xie, Tian},
  journal = {Nature},
  title   = {A generative model for inorganic materials design},
  year    = {2025},
  doi     = {10.1038/s41586-025-08628-5},
}
```

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

Responsible AI Transparency Documentation

The responsible AI transparency documentation can be found here.

Get in touch

If you have any questions not covered here, please ask a question in the Q&A section of Discussions. If you want to report a bug or propose a feature, create an Issue using the template and/or open a pull request.

Owner

  • Name: Microsoft
  • Login: microsoft
  • Kind: organization
  • Email: opensource@microsoft.com
  • Location: Redmond, WA

Open source projects and samples from Microsoft

Committers

Last synced: 10 months ago

All Time
  • Total Commits: 60
  • Total Committers: 9
  • Avg Commits per committer: 6.667
  • Development Distribution Score (DDS): 0.683
Past Year
  • Commits: 60
  • Committers: 9
  • Avg Commits per committer: 6.667
  • Development Distribution Score (DDS): 0.683
Top Committers
Name Email Commits
Claudio Zeni c****i@m****m 19
Daniel Zügner d****r@g****m 17
Daniel Zuegner d****r@m****m 16
Claudio Zeni 3****i 3
microsoft-github-operations[bot] 5****] 1
Jiacheng Wang 6****7 1
Ikko Eltociear Ashimine e****r@g****m 1
Emmanuel Ferdman e****n@g****m 1
CharlesCNorton 1****n 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 118
  • Total pull requests: 75
  • Average time to close issues: 4 days
  • Average time to close pull requests: 1 day
  • Total issue authors: 62
  • Total pull request authors: 13
  • Average comments per issue: 1.98
  • Average comments per pull request: 0.15
  • Merged pull requests: 58
  • Bot issues: 1
  • Bot pull requests: 6
Past Year
  • Issues: 118
  • Pull requests: 75
  • Average time to close issues: 4 days
  • Average time to close pull requests: 1 day
  • Issue authors: 62
  • Pull request authors: 13
  • Average comments per issue: 1.98
  • Average comments per pull request: 0.15
  • Merged pull requests: 58
  • Bot issues: 1
  • Bot pull requests: 6
Top Authors
Issue Authors
  • yuhao1982 (19)
  • Ruziy (12)
  • Mofahdi (9)
  • yuedawang (5)
  • chiku-parida (3)
  • msehabibur (3)
  • ditto7284 (3)
  • 401-Nick (2)
  • Andy5256 (2)
  • wigging (2)
  • parkjunkil (2)
  • asedova (2)
  • janklinux (2)
  • awvwgk (2)
  • Jonathan-park17 (2)
Pull Request Authors
  • danielzuegner (35)
  • ClaudioZeni (14)
  • dependabot[bot] (5)
  • emmanuel-ferdman (3)
  • jcwang587 (2)
  • CharlesCNorton (2)
  • eltociear (2)
  • xiaodoushabing (2)
  • oib12 (1)
  • luisbro (1)
  • rongaoli (1)
Top Labels
Issue Labels
question (7) bug (4) enhancement (1)
Pull Request Labels
dependencies (5) python (5) documentation (4) enhancement (3) bug (2)

Packages

  • Total packages: 2
  • Total downloads:
    • pypi 63 last-month
  • Total dependent packages: 0
    (may contain duplicates)
  • Total dependent repositories: 0
    (may contain duplicates)
  • Total versions: 5
  • Total maintainers: 2
proxy.golang.org: github.com/microsoft/mattergen
  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent packages count: 5.8%
Average: 6.0%
Dependent repos count: 6.2%
Last synced: 7 months ago
pypi.org: mattergen

MatterGen is a generative model for inorganic materials design across the periodic table that can be fine-tuned to steer the generation towards a wide range of property constraints.

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 63 Last month
Rankings
Dependent packages count: 8.8%
Average: 29.2%
Dependent repos count: 49.5%
Maintainers (2)
Last synced: 7 months ago

Dependencies

.github/workflows/component_detection.yml actions
  • actions/checkout v4 composite
  • actions/setup-python v5 composite
  • advanced-security/component-detection-dependency-submission-action v0.0.4 composite
pyproject.toml pypi
  • GitPython *
  • SMACT *
  • ase >=3.22.1
  • atomate2 >=0.0.13
  • autopep8 *
  • bunnet ==1.2.0
  • cachetools *
  • contextlib2 *
  • custodian *
  • cvxpy *
  • cython *
  • dill *
  • e3nn >=0.5.1
  • emmet-core >=0.84.2
  • fire *
  • hydra-core ==1.3.1
  • hydra-joblib-launcher ==1.1.5
  • ipywidgets *
  • jobflow *
  • jupyterlab >=4.2.5
  • lmdb *
  • maggma *
  • matminer @git+https://github.com/hackingmaterials/matminer.git@39d93aba11c4b2ad98cc46433fedba4125373b2b
  • matplotlib ==3.8.4
  • matscipy >=0.7.0
  • mattersim @ git+https://github.com/microsoft/mattersim.git
  • mongomock >=4.1.2
  • monty ==2024.7.30
  • mp-api *
  • multiprocess *
  • nglview *
  • notebook >=7.2.2
  • numpy <2.0
  • omegaconf ==2.3.0
  • p-tqdm >=1.4.0
  • petname >=2.6
  • protobuf ~=3.20
  • pyarrow *
  • pydantic [email]>=2.5.1
  • pyg-lib *
  • pylint *
  • pymatgen >=2024.6.4
  • pymatgen-analysis-alloys *
  • pymatviz ==0.8.2
  • pymongo *
  • pytest *
  • pytorch-lightning ==2.0.6
  • rich-click <1.8.0
  • seaborn *
  • setuptools *
  • streamlit *
  • sympy >=1.11.1
  • tensorboard *
  • torch ==2.2.1+cu118
  • torch_cluster *
  • torch_geometric >=2.5
  • torch_scatter *
  • torch_sparse *
  • torch_spline_conv *
  • torchaudio ==2.2.1+cu118
  • torchvision ==0.17.1+cu118
  • tqdm *
  • urllib3 <2.0.0
  • wandb >=0.10.33