pygrad
Lightweight automatic differentiation engine written in Python
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file (found)
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.9%) to scientific vocabulary
Repository
Lightweight automatic differentiation engine written in Python
Basic Info
- Host: GitHub
- Owner: talveth
- License: MIT
- Language: Python
- Default Branch: master
- Homepage: https://talveth.github.io/pygrad/
- Size: 28 MB
Statistics
- Stars: 9
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
pygrad: A lightweight differentiation engine written in Python.
Documentation: https://talveth.github.io/pygrad/.
This is a lightweight (<300kB) automatic differentiation engine based on NumPy, Numba, and opt_einsum. Included are a differentiable Tensor class; layers such as Dropout, Linear, and Attention; loss functions such as BCE and CCE; optimizers such as SGD, RMSProp, and Adam; and example DNN/CNN/Transformer architectures. This library is a good fit if you want to run backpropagation on small, simple functions or networks without much overhead.
The main component is the Tensor class, which supports many math operations. Tensors have .value and .grad attributes; gradients are populated by calling .backward() on a Tensor or on any of its children. Tensors can be used standalone or to construct more complex architectures such as a vanilla Transformer.
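To make the .value/.grad/.backward() mechanics concrete, here is a minimal scalar reverse-mode sketch in plain Python. This is not pygrad's implementation (the `Scalar` class and its internals are illustrative assumptions); it only shows how such an interface can be built:

```python
# Minimal scalar reverse-mode autodiff sketch. NOT pygrad's code;
# it only illustrates a .value/.grad/.backward() interface.
class Scalar:
    def __init__(self, value, parents=()):
        self.value = float(value)
        self.grad = 0.0
        self._parents = parents  # (parent_node, local_gradient) pairs

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value * other.value,
                      ((self, other.value), (other, self.value)))

    def backward(self):
        # Seed the output gradient, then push gradients edge by edge
        # via the chain rule. This simple traversal is correct when
        # shared nodes are leaves; a full engine uses topological order.
        self.grad = 1.0
        stack = [self]
        while stack:
            node = stack.pop()
            for parent, local in node._parents:
                parent.grad += local * node.grad
                stack.append(parent)

x = Scalar(3.0)
y = x * x + x      # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.value, x.grad)  # 3.0 7.0
```

A real engine like pygrad additionally handles arrays, broadcasting, and many more operations, but the gradient-accumulation idea is the same.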
Installation
```bash
pip install pygradproject
```

or, from source:

```bash
git clone https://github.com/baubels/pygrad.git
cd pygrad
pip install .   # or: pip install .[examples] or .[dev]
```
Usage
Tensors accept the same input values as a NumPy array. Create them with Tensor(value) or tensor.array(value).
Run backprop on them with .backward().
A simple usage example:
```python
from pygrad.tensor import Tensor

x = Tensor(1)
(((x**3 + x**2 + x + 1) - 1)**2).backward()
x.value, x.grad  # 1.0, 36.0
```
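As a sanity check independent of pygrad, the gradient 36.0 can be reproduced with a central finite difference on the same expression in plain Python:

```python
# Central finite-difference check of df/dx at x = 1 for
# f(x) = ((x**3 + x**2 + x + 1) - 1)**2, the expression above.
def f(x):
    return ((x**3 + x**2 + x + 1) - 1)**2

h = 1e-6
numeric = (f(1 + h) - f(1 - h)) / (2 * h)
print(round(numeric, 4))  # 36.0, matching x.grad
```

Analytically, f(x) = (x³ + x² + x)², so f′(x) = 2(x³ + x² + x)(3x² + 2x + 1), and f′(1) = 2 · 3 · 6 = 36.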
Since Tensors store their value in .value and their gradient in .grad, it is easy to perform gradient descent.
```python
for _ in range(100):
    (((x**3 + x**2 + x + 1) - 1)**2).backward()  # gradients are automatically reset when called
    x.value = x.value - 0.01*x.grad
```
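The same descent can be reproduced without pygrad using the hand-derived gradient; this standalone plain-Python check (not pygrad code) shows x moving toward the minimizer x = 0:

```python
# Gradient descent on f(x) = (x**3 + x**2 + x)**2 using the
# hand-derived gradient f'(x) = 2*(x**3 + x**2 + x)*(3*x**2 + 2*x + 1).
# Standalone check of the loop above; does not use pygrad.
def grad_f(x):
    return 2 * (x**3 + x**2 + x) * (3 * x**2 + 2 * x + 1)

x = 1.0
for _ in range(100):
    x = x - 0.01 * grad_f(x)

print(x)  # well below the starting point, approaching x = 0
```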
Tensors can also be operated on with broadcast-friendly NumPy arrays, or with other Tensors whose values are broadcast-friendly. Internally, a Tensor always casts its value to a NumPy array.
```python
import numpy as np

x = Tensor(np.ones((10, 20)))
y = Tensor(np.ones((20, 10)))
z1 = x @ y
z2 = x @ np.ones((20, 10))
np.all(z1.value == z2.value)  # True
```
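"Broadcast-friendly" here refers to NumPy's broadcasting rule: shapes are aligned from the right, and each dimension pair must be equal or contain a 1. A small helper (hypothetical, plain Python, not part of pygrad) computes the broadcast result shape under that rule:

```python
# NumPy-style broadcast shape computation, in plain Python.
# Shapes are compared right-aligned; each dim pair must match or be 1.
def broadcast_shape(a, b):
    a = (1,) * max(0, len(b) - len(a)) + tuple(a)  # left-pad with 1s
    b = (1,) * max(0, len(a) - len(b)) + tuple(b)
    result = []
    for x, y in zip(a, b):
        if x == y or x == 1 or y == 1:
            result.append(max(x, y))
        else:
            raise ValueError(f"shapes not broadcastable: {x} vs {y}")
    return tuple(result)

print(broadcast_shape((10, 20), (20,)))   # (10, 20)
print(broadcast_shape((1, 20), (10, 1)))  # (10, 20)
```

Note that the matrix-multiplication example above follows matmul's shape rule ((10, 20) @ (20, 10) gives (10, 10)) rather than elementwise broadcasting; broadcasting applies to elementwise operations such as addition and multiplication.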
Enough operations are defined to build many different kinds of models. For example usage and in-depth descriptions of each component of pygrad, check out the docs.
Citation/Contribution
If you find this project helpful in your research or work, I kindly ask that you cite it: View Citation. Thank you!
If there are issues with the project, please submit an issue. Otherwise, please read the current status for contributors.
Owner
- Name: DSK
- Login: talveth
- Kind: user
- Repositories: 9
- Profile: https://github.com/talveth
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Kurganov"
    given-names: "Danila"
title: "pygrad"
version: 0.0.1
date-released: 2024-10-25
url: "https://github.com/baubels/pygrad"
GitHub Events
Total
- Watch event: 1
- Push event: 2
Last Year
- Watch event: 1
- Push event: 2
Dependencies
- tqdm ==4.65.0
- numba *
- numpy *
- actions/checkout v4 composite
- actions/setup-python v4 composite
- ad-m/github-push-action master composite
- actions/checkout v3 composite
- actions/setup-python v4 composite