opt\_einsum - A Python package for optimizing contraction order for einsum-like expressions
Published in JOSS (2018)
Science Score: 100.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 6 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to joss.theoj.org
- ✓ Committers with academic emails: 3 of 28 committers (10.7%) from academic institutions
- ○ Institutional organization owner
- ✓ JOSS paper metadata: published in Journal of Open Source Software
Keywords
Repository
⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
Basic Info
- Host: GitHub
- Owner: dgasmith
- License: mit
- Language: Python
- Default Branch: main
- Homepage: https://dgasmith.github.io/opt_einsum/
- Size: 4.07 MB
Statistics
- Stars: 939
- Watchers: 20
- Forks: 74
- Open Issues: 35
- Releases: 20
Topics
Metadata Files
README.md
Optimized Einsum
Optimized Einsum: A tensor contraction order optimizer
Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g., np.einsum, dask.array.einsum, pytorch.einsum, tensorflow.einsum) by optimizing the expression's contraction order and dispatching many operations to canonical BLAS, cuBLAS, or other specialized routines.
Optimized einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch, Tensorflow, CuPy, Sparse, Theano, JAX, and Autograd arrays as well as potentially any library which conforms to a standard API. See the documentation for more information.
Example usage
The opt_einsum.contract function can often act as a drop-in replacement for einsum functions without further changes to the code while providing superior performance. Here, a tensor contraction is performed with and without optimization:
```python
import numpy as np
from opt_einsum import contract

N = 10
C = np.random.rand(N, N)
I = np.random.rand(N, N, N, N)

%timeit np.einsum('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
1 loops, best of 3: 934 ms per loop

%timeit contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
1000 loops, best of 3: 324 us per loop
```
In this particular example, we see a ~3000x performance improvement which is not uncommon when compared against unoptimized contractions. See the backend examples for more information on using other backends.
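As a minimal, hedged sketch of backend dispatch (assuming PyTorch is installed; opt_einsum infers the backend from the type of the arrays it receives), the same contraction can be kept entirely within PyTorch:

```python
# Minimal sketch, assuming PyTorch is available. Because the operands are
# torch tensors, opt_einsum dispatches the contraction to PyTorch routines
# and returns a torch.Tensor.
import torch
from opt_einsum import contract

N = 10
C = torch.rand(N, N)
I = torch.rand(N, N, N, N)

result = contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
print(type(result), result.shape)  # <class 'torch.Tensor'> torch.Size([10, 10, 10, 10])
```

Placing the tensors on a GPU device before calling contract should keep the computation there.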
Features
The algorithms found in this repository often power the einsum optimizations in many of the above projects. For example, the optimization of np.einsum has been passed upstream, and most of the features found in this repository can be enabled with np.einsum(..., optimize=True). However, this repository often has more up-to-date algorithms for complex contractions.
The following capabilities are enabled by opt_einsum:
- Inspect detailed information about the path chosen (see the sketch after this list).
- Perform contractions with numerous backends, including on the GPU and with libraries such as TensorFlow and PyTorch.
- Generate reusable expressions, potentially with constant tensors, that can be compiled for greater performance.
- Use an arbitrary number of indices to find contractions for hundreds or even thousands of tensors.
- Share intermediate computations among multiple contractions.
- Compute gradients of tensor contractions using autograd or jax.
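As a brief sketch of the first and third capabilities above, using the documented contract_path and contract_expression functions and the arrays from the earlier example (printed values are illustrative):

```python
# Minimal sketch of path inspection and reusable expressions.
import numpy as np
from opt_einsum import contract_path, contract_expression

N = 10
C = np.random.rand(N, N)
I = np.random.rand(N, N, N, N)

# Inspect the contraction path chosen and its estimated cost.
path, path_info = contract_path('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
print(path)       # pairwise contraction order, e.g. [(0, 2), (0, 3), (0, 2), (0, 1)]
print(path_info)  # report of scaling, FLOP count, and largest intermediate

# Build a reusable expression from shapes alone and evaluate it repeatedly.
expr = contract_expression('pi,qj,ijkl,rk,sl->pqrs',
                           (N, N), (N, N), (N, N, N, N), (N, N), (N, N))
result = expr(C, C, I, C, C)
```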
Please see the documentation for more features!
Installation
opt_einsum can be installed either via pip install opt_einsum or via conda install opt_einsum -c conda-forge.
See the installation documentation for further methods.
Citation
If this code has benefited your research, please support us by citing:
Daniel G. A. Smith and Johnnie Gray, opt_einsum - A Python package for optimizing contraction order for einsum-like expressions. Journal of Open Source Software, 2018, 3(26), 753
DOI: https://doi.org/10.21105/joss.00753
Contributing
All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
A detailed overview on how to contribute can be found in the contributing guide.
Owner
- Name: Daniel Smith
- Login: dgasmith
- Kind: user
- Location: Boston, MA
- Company: Flagship Pioneering
- Repositories: 65
- Profile: https://github.com/dgasmith
- Twitter: @dga_smith
JOSS Publication
opt\_einsum - A Python package for optimizing contraction order for einsum-like expressions
Authors
Tags
array, tensors, optimization, phylogenetics, natural selection, molecular evolution
Citation (CITATION.cff)
cff-version: 1.1.0
message: "If you use this software, please cite it as below."
authors:
- family-names: Smith
given-names: Daniel
orcid: https://orcid.org/0000-0001-8626-0900
- family-names: Gray
given-names: Johnnie
orcid: https://orcid.org/0000-0001-9461-3024
title: "`opt_einsum` - A Python package for optimizing contraction order for einsum-like expressions"
version: 3.3.0
doi: 10.21105/joss.00753
date-released: 2019-06-28
url: "https://github.com/dgasmith/opt_einsum"
GitHub Events
Total
- Issues event: 11
- Watch event: 81
- Delete event: 1
- Issue comment event: 26
- Push event: 6
- Pull request review event: 4
- Pull request event: 7
- Fork event: 7
Last Year
- Issues event: 11
- Watch event: 81
- Delete event: 1
- Issue comment event: 26
- Push event: 6
- Pull request review event: 4
- Pull request event: 7
- Fork event: 7
Committers
Last synced: 5 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Daniel Smith | m****n@m****m | 187 |
| jcmgray | j****4@u****k | 82 |
| Fritz Obermeyer | f****o@u****m | 27 |
| Daniel Smith | d****7@g****m | 6 |
| Fabian-Robert Stöter | m****l@f****m | 3 |
| mrader1248 | 3****8 | 2 |
| stonebig | s****4@g****m | 2 |
| johnthagen | j****n | 1 |
| ax7e | 5****e | 1 |
| Andrew Sears | a****s | 1 |
| Arfon Smith | a****n | 1 |
| Colin Watson | c****n@d****g | 1 |
| Greg Roodt | g****t@g****m | 1 |
| Hongxu Jia | h****3@g****m | 1 |
| Jane (Yuan) Xu | 3****9 | 1 |
| Kian Meng Ang | k****g@g****m | 1 |
| Lori A. Burns | l****s@g****m | 1 |
| Lukas Geiger | l****4@g****m | 1 |
| Neil Girdhar | m****k@g****m | 1 |
| Nils Werner | n****r@a****e | 1 |
| Peter Hawkins | h****p@c****u | 1 |
| Robert T. McGibbon | r****o@g****m | 1 |
| Roman Novak | 4****g | 1 |
| Salvatore Mandrà | s****a@n****v | 1 |
| Samuel St-Jean | s****m@g****m | 1 |
| Scott Sievert | g****b@s****m | 1 |
| Weitang Li | l****1@1****m | 1 |
| Yuji Kanagawa | y****e@g****m | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 4 months ago
All Time
- Total issues: 84
- Total pull requests: 57
- Average time to close issues: 8 months
- Average time to close pull requests: about 2 months
- Total issue authors: 59
- Total pull request authors: 22
- Average comments per issue: 3.65
- Average comments per pull request: 2.35
- Merged pull requests: 47
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 6
- Pull requests: 13
- Average time to close issues: about 1 month
- Average time to close pull requests: 7 days
- Issue authors: 6
- Pull request authors: 5
- Average comments per issue: 0.17
- Average comments per pull request: 1.31
- Merged pull requests: 8
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- jcmgray (6)
- dgasmith (5)
- Geositta2000 (5)
- refraction-ray (4)
- ghost (3)
- yaroslavvb (3)
- fangzhangmnm (2)
- philip-bl (2)
- Lancashire3000 (2)
- BMOHARRIS (2)
- eamartin (1)
- rht (1)
- Tomohirohashizume (1)
- fishjojo (1)
- srush (1)
Pull Request Authors
- dgasmith (28)
- jcmgray (13)
- janeyx99 (3)
- NeilGirdhar (2)
- lgeiger (2)
- nova77 (2)
- juanjosegarciaripoll (2)
- cjwatson (2)
- rlouf (2)
- DanisNone (2)
- hongxu-jia (1)
- hawkinsp (1)
- kngwyu (1)
- ax7e (1)
- kianmeng (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 18
- Total downloads: unknown
- Total dependent packages: 23 (may contain duplicates)
- Total dependent repositories: 439 (may contain duplicates)
- Total versions: 69
- Total maintainers: 2
proxy.golang.org: github.com/dgasmith/opt_einsum
- Documentation: https://pkg.go.dev/github.com/dgasmith/opt_einsum#section-documentation
- License: mit
- Latest release: v3.4.0+incompatible (published about 1 year ago)
Rankings
alpine-edge: py3-opt_einsum-doc
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization (documentation)
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
spack.io: py-opt-einsum
Optimized Einsum: A tensor contraction order optimizer.
- Homepage: https://github.com/dgasmith/opt_einsum
- License: []
- Latest release: 3.4.0 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-edge: py3-opt_einsum-pyc
Precompiled Python bytecode for py3-opt_einsum
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-edge: py3-opt_einsum
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
conda-forge.org: opt_einsum
Einsum is a very powerful function for contracting tensors of arbitrary dimension and index. However, it is typically only optimized to contract two terms at a time resulting in non-optimal scaling. This package optimizes the contraction order for arbitrarily large speedups. See the docs for more information: dgasmith.github.io/opt_einsum/
- Homepage: http://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0 (published over 5 years ago)
Rankings
conda-forge.org: opt-einsum
Einsum is a very powerful function for contracting tensors of arbitrary dimension and index. However, it is typically only optimized to contract two terms at a time resulting in non-optimal scaling. This package optimizes the contraction order for arbitrarily large speedups. See the docs for more information: dgasmith.github.io/opt_einsum/
- Homepage: http://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.0.0 (published over 6 years ago)
Rankings
anaconda.org: opt_einsum
Einsum is a very powerful function for contracting tensors of arbitrary dimension and index. However, it is typically only optimized to contract two terms at a time resulting in non-optimal scaling. This package optimizes the contraction order for arbitrarily large speedups. See the docs for more information: http://optimized-einsum.readthedocs.io
- Homepage: http://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0 (published over 4 years ago)
Rankings
alpine-v3.20: py3-opt_einsum
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0-r2 (published over 1 year ago)
Rankings
Maintainers (1)
alpine-v3.19: py3-opt_einsum-pyc
Precompiled Python bytecode for py3-opt_einsum
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0-r1 (published over 2 years ago)
Rankings
Maintainers (1)
alpine-v3.21: py3-opt_einsum
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.22: py3-opt_einsum-pyc
Precompiled Python bytecode for py3-opt_einsum
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.20: py3-opt_einsum-pyc
Precompiled Python bytecode for py3-opt_einsum
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0-r2 (published over 1 year ago)
Rankings
Maintainers (1)
alpine-v3.21: py3-opt_einsum-pyc
Precompiled Python bytecode for py3-opt_einsum
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.22: py3-opt_einsum-doc
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization (documentation)
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.21: py3-opt_einsum-doc
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization (documentation)
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.22: py3-opt_einsum
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.4.0-r1 (published about 1 year ago)
Rankings
Maintainers (1)
alpine-v3.19: py3-opt_einsum
Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization
- Homepage: https://github.com/dgasmith/opt_einsum
- License: MIT
- Latest release: 3.3.0-r1 (published over 2 years ago)
Rankings
Maintainers (1)
Dependencies
- actions/checkout v2 composite
- actions/setup-python v2 composite
- actions/checkout v2 composite
- actions/setup-python v2.2.1 composite
- conda-incubator/setup-miniconda v2 composite
- actions/checkout v2 composite
- codecov/codecov-action v1 composite
- conda-incubator/setup-miniconda v2 composite
