MACE

MACE: a Machine-learning Approach to Chemistry Emulation - Published in JOSS (2025)

https://github.com/silkemaes/mace

Science Score: 98.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in JOSS metadata
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
    Published in Journal of Open Source Software

Keywords

astrochemistry astrophysics autoencoder machine-learning simulation
Last synced: 4 months ago

Repository

MACE, a Machine-learning Approach to Chemistry Emulation

Basic Info
Statistics
  • Stars: 2
  • Watchers: 2
  • Forks: 5
  • Open Issues: 2
  • Releases: 3
Topics
astrochemistry astrophysics autoencoder machine-learning simulation
Created almost 3 years ago · Last pushed 9 months ago
Metadata Files
Readme License Citation

README.md

MACE

Welcome to the MACE repository!

MACE, a Machine-learning Approach to Chemistry Emulation, by Maes et al. (2024), is a surrogate model for chemical kinetics. It was developed in the context of circumstellar envelopes (CSEs) of asymptotic giant branch (AGB) stars, i.e. evolved low-mass stars.

During development, the chemical models of Maes et al. (2023) were used. That paper also provides more details about the astrochemical environment.

MACE is implemented in Python and is trained using PyTorch, together with torchode (Lienen & Günnemann, 2022).


Table of contents

Please find documentation on the code in the scripts. On the documentation page, we provide general info on MACE and tutorial notebooks.

- Installation
- What is MACE?
- How to use?
- Example case
- Contributions
- Contact
- Acknowledgements


Notes on installation

  • MACE is currently not available as a package on PyPI. There is a package named mace on PyPI, but it is not this one.
  • To use MACE, please clone the repository and install the required packages listed in requirements.txt:

```
git clone https://github.com/silkemaes/MACE.git
```

What is MACE?

MACE offers a surrogate model that emulates the evolution of chemical abundances over time in a dynamical physical environment. As the name states, it makes use of machine-learning techniques. More specifically, combining an autoencoder (blue) and a trainable ordinary differential equation (ODE) (red) makes it possible to emulate a chemical kinetics model accurately.

Hence, MACE is a framework, an architecture, that can be trained on specific chemical datasets; before use, it should be made compatible with the dataset, see How to use?.

The architecture of MACE is shown schematically in the figure below (MACE architecture).


In formula, MACE is stated as

$$ {\hat{\boldsymbol{n}}}(t) = \mathcal{D}\Big( G \big( \mathcal{E} ({\boldsymbol{n}}, {\boldsymbol{p}}),t \big) \Big). $$

Here, ${\hat{\boldsymbol{n}}}(t)$ are the predicted chemical abundances at a time $t$ later than the initial state ${\boldsymbol{n}}$. $\mathcal{E}$ and $\mathcal{D}$ represent the encoder and decoder of the autoencoder, respectively. The autoencoder maps the chemical space ${\boldsymbol{n}}$ together with the physical space ${\boldsymbol{p}}$ to a lower-dimensional representation $\boldsymbol{z}$, called the latent space. The function $G$ describes the evolution in latent space such that $\boldsymbol{z}(\Delta t) = G(\boldsymbol{z}, \Delta t)=\int_0^{\Delta t} g(\boldsymbol{z})\,{\rm d}t$.

For more details, check out our paper: Maes et al. (2024).
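To make the formula above concrete, here is a toy, pure-Python sketch of the pipeline ${\hat{\boldsymbol{n}}}(t) = \mathcal{D}(G(\mathcal{E}(\boldsymbol{n}, \boldsymbol{p}), t))$: encode, evolve in latent space, decode. All functions, dimensions, and dynamics here are invented for illustration; in MACE itself, the encoder, decoder, and latent ODE $g$ are trainable neural networks.

```python
def encoder(n, p):
    # E: map abundances n and physical parameters p to a latent state z.
    # Trivial 2-component latent space, purely for illustration.
    return [sum(n) / len(n), sum(p) / len(p)]

def g(z):
    # g: latent-space dynamics (a trainable neural ODE in MACE).
    return [-0.1 * z[0], -0.2 * z[1]]

def G(z, dt, steps=100):
    # G: integrate dz/dt = g(z) over dt, here with forward Euler.
    h = dt / steps
    for _ in range(steps):
        dz = g(z)
        z = [zi + h * dzi for zi, dzi in zip(z, dz)]
    return z

def decoder(z):
    # D: map the latent state back to predicted abundances n_hat.
    return [z[0] + z[1], z[0] - z[1]]

# n_hat(t) = D( G( E(n, p), t ) )
n0 = [1.0, 0.5]    # initial chemical abundances
p = [300.0]        # physical parameters (e.g. temperature)
n_hat = decoder(G(encoder(n0, p), dt=1.0))
```

The point of the latent approach is that integrating the small latent ODE is far cheaper than integrating the full, stiff chemical-kinetics system.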


How to use?

The script routine.py gives the flow of training and storing a MACE architecture, and applies the trained model to the specified test dataset once training is finished. As such, it returns an averaged error of the MACE model compared to the classical model. More info on the training routine can be found in the paper.
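Since abundances span many orders of magnitude, such an averaged error is naturally computed in log space. The sketch below shows one plausible metric of this kind; the exact definition used by routine.py is the one given in Maes et al. (2024), and the function name and epsilon floor here are assumptions for illustration.

```python
import math

def mean_log_error(n_true, n_pred, eps=1e-30):
    """Mean absolute difference of log10 abundances, averaged over all
    time steps and species. Illustrative only; the exact error metric
    of routine.py is defined in Maes et al. (2024)."""
    total, count = 0.0, 0
    for row_t, row_p in zip(n_true, n_pred):
        for x_t, x_p in zip(row_t, row_p):
            # eps avoids log10(0) for species with zero abundance
            total += abs(math.log10(x_t + eps) - math.log10(x_p + eps))
            count += 1
    return total / count

# toy abundances: 2 time steps x 3 species
n_true = [[1e-4, 1e-8, 1e-12], [2e-4, 5e-9, 1e-12]]
n_pred = [[1.1e-4, 1e-8, 1e-12], [2e-4, 6e-9, 1e-12]]
err = mean_log_error(n_true, n_pred)
```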

An annotated notebook of the routine can be found here.

The script routine.py takes an input file with the needed (hyper)parameter setup. An example of such an input file can be found in input/. Run it as:

```
python routine.py example
```

Disclaimer:

In order to train MACE with a certain chemical dataset, the Dataset class should be made compatible with that data. Currently, the script src/mace/CSE_0D/dataset.py works only for the specific dataset used here, i.e. models from Maes et al. (2023), using the Rate22-CSE code.
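To illustrate what such a dataset adapter must provide, here is a minimal, self-contained sketch of a map-style dataset (the interface PyTorch data loaders expect: `__len__` and `__getitem__`). The class name, field names, and sample layout are hypothetical; a real adapter should mirror src/mace/CSE_0D/dataset.py and your own chemical models.

```python
class MyChemDataset:
    """Hypothetical dataset adapter: flattens per-model time series into
    (abundances, physical parameters, time) samples."""

    def __init__(self, models):
        # `models` is a list of dicts with per-model arrays (names assumed):
        #   "n": abundances per time step, "p": physical parameters, "t": times
        self.samples = []
        for m in models:
            for i in range(len(m["t"])):
                self.samples.append((m["n"][i], m["p"][i], m["t"][i]))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]

# toy data: one model with 2 time steps, 3 species, 1 physical parameter
models = [{"n": [[1e-4, 1e-8, 1e-12], [2e-4, 5e-9, 1e-12]],
           "p": [[300.0], [280.0]],
           "t": [0.0, 1.0]}]
ds = MyChemDataset(models)
```

In practice the adapter would also handle normalisation and conversion to tensors, as the CSE_0D dataset script does for the Rate22-CSE models.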


Example case

This repository contains a trained MACE model as a test case, see model/20240604_160152.

The code for loading a trained MACE model can be found in the script src/mace/load.py, and the testing code in src/mace/test.py. An annotated notebook can be found in the documentation.

An annotated version of the training routine can be found in routine.ipynb, which is the same as the eponymous script.


Contact

If any comments or issues come up, please contact me via email, or set up a GitHub issue.


Developers & Contributions

Developers:

- Silke Maes
- Frederik De Ceuster

Scientific & technical advisors:

- Marie Van de Sande
- Leen Decin

Contributors:

- Steven Rieder

Feel free to contribute to MACE via pull requests and discussions!


Acknowledgements

The MACE architecture is free to use. Please cite our paper Maes et al. (2024).

Owner

  • Name: Silke Maes
  • Login: silkemaes
  • Kind: user
  • Location: Belgium
  • Company: KU Leuven - Institute for Astronomy

PhD student in Computational Astrophysics & Astrochemistry

JOSS Publication

MACE: a Machine-learning Approach to Chemistry Emulation
Published
April 04, 2025
Volume 10, Issue 108, Page 7148
Authors
Silke Maes ORCID
Institute of Astronomy, KU Leuven, Celestijnenlaan 200D, B-3001 Leuven, Belgium
Frederik De Ceuster ORCID
Institute of Astronomy, KU Leuven, Celestijnenlaan 200D, B-3001 Leuven, Belgium, Leuven Gravity Institute, KU Leuven, Celestijnenlaan 200D, Leuven, Belgium
Marie Van de Sande ORCID
Leiden Observatory, Leiden University, PO Box 9513, 2300 RA Leiden, The Netherlands, School of Physics and Astronomy, University of Leeds, Leeds LS2 9JT, United Kingdom
Leen Decin ORCID
Institute of Astronomy, KU Leuven, Celestijnenlaan 200D, B-3001 Leuven, Belgium, School of Chemistry, University of Leeds, Leeds LS2 9JT, United Kingdom
Editor
Josh Borrow ORCID
Tags
astrophysics chemistry surrogate model stellar winds

Citation (CITATION.cff)

cff-version: 1.1.0
message: "If you use this software, please cite it as below."
authors:
- family-names: Maes
  given-names: Silke
  orcid: https://orcid.org/0000-0003-4159-9964
- family-names: De Ceuster
  given-names: Frederik
- family-names: Van de Sande
  given-names: Marie
- family-names: Decin
  given-names: Leen
title: "silkemaes/MACE: a Machine-learning Approach to Chemistry Emulation"
version: v1.1.0
date-released: 2025-03-11

GitHub Events

Total
  • Create event: 3
  • Release event: 4
  • Issues event: 11
  • Watch event: 1
  • Delete event: 2
  • Issue comment event: 36
  • Push event: 21
  • Pull request event: 6
  • Fork event: 3
Last Year
  • Create event: 3
  • Release event: 4
  • Issues event: 11
  • Watch event: 1
  • Delete event: 2
  • Issue comment event: 36
  • Push event: 21
  • Pull request event: 6
  • Fork event: 3

Committers

Last synced: 5 months ago

All Time
  • Total Commits: 384
  • Total Committers: 5
  • Avg Commits per committer: 76.8
  • Development Distribution Score (DDS): 0.026
Past Year
  • Commits: 43
  • Committers: 3
  • Avg Commits per committer: 14.333
  • Development Distribution Score (DDS): 0.047
Top Committers
Name Email Commits
Silke Maes s****s@k****e 374
Silke Maes s****m@p****e 5
Steven Rieder s****n@r****l 3
Warrick Ball w****l@g****m 1
Frederik De Ceuster f****r@g****m 1

Issues and Pull Requests

Last synced: 4 months ago

All Time
  • Total issues: 9
  • Total pull requests: 14
  • Average time to close issues: about 2 months
  • Average time to close pull requests: about 1 month
  • Total issue authors: 2
  • Total pull request authors: 3
  • Average comments per issue: 4.56
  • Average comments per pull request: 0.07
  • Merged pull requests: 9
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 9
  • Pull requests: 8
  • Average time to close issues: about 2 months
  • Average time to close pull requests: about 7 hours
  • Issue authors: 2
  • Pull request authors: 2
  • Average comments per issue: 4.56
  • Average comments per pull request: 0.0
  • Merged pull requests: 5
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • trappitsch (7)
  • richings (3)
Pull Request Authors
  • silkemaes (10)
  • rieder (6)
  • warrickball (2)