neograd

A deep learning framework created from scratch with Python and NumPy

https://github.com/pranftw/neograd

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org, zenodo.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.8%) to scientific vocabulary

Keywords

ai autograd automatic-differentiation deep-learning deep-learning-framework machine-learning neural-network numpy python pytorch-api scratch-implementation
Last synced: 6 months ago

Repository

A deep learning framework created from scratch with Python and NumPy

Basic Info
Statistics
  • Stars: 237
  • Watchers: 4
  • Forks: 9
  • Open Issues: 0
  • Releases: 5
Topics
ai autograd automatic-differentiation deep-learning deep-learning-framework machine-learning neural-network numpy python pytorch-api scratch-implementation
Created over 3 years ago · Last pushed about 3 years ago
Metadata Files
  • Readme
  • License
  • Citation

README.md

Neograd

A Deep Learning framework created from scratch with Python and NumPy



Get started

Installation

pip install neograd

PyPI

https://pypi.org/project/neograd/

Documentation

https://neograd.readthedocs.io/

Explore on Colab

https://colab.research.google.com/drive/1D4JgBwKgnNQ8Q5DpninB6rdFUidRbjwM?usp=sharing
https://colab.research.google.com/drive/184916aB5alIyM_xCa0qWnZAL35fDa43L?usp=sharing

Motivation

I firmly believe that in order to understand something completely, you have to build it on your own from scratch. I used to do gradient calculations analytically and thought that autograd was some kind of magic. This project was initially built to understand autograd, but its scope was later extended. You might wonder why I created another framework when very popular ones like TensorFlow and PyTorch already exist. The answer is that they have very complex codebases that are difficult to grasp. I intend for this repository to be used as an educational tool to understand how things work under the hood in those giant frameworks, with code that is intuitive and easily readable.

Features

Automatic Differentiation

autograd offers automatic differentiation, implemented for the most commonly required operations for vectors of any dimension, with broadcasting capabilities.

```python
import neograd as ng

a = ng.tensor(3, requires_grad=True)
b = ng.tensor([1,2,3], requires_grad=True)
c = a+b
c.backward([1,1,1])
print(a.grad)
print(b.grad)
```

Custom autograd operations

If you want a custom operation to have autograd capabilities, it can be defined with a very simple interface: each operation has a forward method and a backward method.

```python
class Custom(Operation):
  def forward(self):
    pass
  def backward(self):
    pass
```
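
The stub above only shows the shape of the interface. As a framework-independent illustration of the forward/backward pattern (the class and method signatures below are assumptions for illustration, not neograd's actual Operation API), an element-wise square operation could look like this:

```python
import numpy as np

class Square:
    """Illustrative op: caches its input on forward, applies the chain rule on backward."""
    def forward(self, x):
        self.x = x          # cache the input so backward can reuse it
        return x ** 2

    def backward(self, upstream_grad):
        # d(x^2)/dx = 2x, multiplied by the gradient flowing in from above (chain rule)
        return 2 * self.x * upstream_grad

op = Square()
out = op.forward(np.array([1.0, 2.0, 3.0]))   # -> [1., 4., 9.]
grad = op.backward(np.ones(3))                # -> [2., 4., 6.]
print(out, grad)
```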

Gradient Checking

Debug your models and functions with gradient checking, to ensure that gradients are being propagated correctly.
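
Gradient checking compares analytically computed gradients against numerical estimates obtained from finite differences. Below is a minimal, framework-independent sketch of the idea (the `numerical_grad` helper is hypothetical and not neograd's `grad_check` API, which appears in the full example further down):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-7):
    """Estimate df/dx with central finite differences, one element at a time."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        orig = x.flat[i]
        x.flat[i] = orig + eps
        f_plus = f(x)
        x.flat[i] = orig - eps
        f_minus = f(x)
        x.flat[i] = orig           # restore the original value
        grad.flat[i] = (f_plus - f_minus) / (2 * eps)
    return grad

# f(x) = sum(x**2), whose analytic gradient is 2x
x = np.array([1.0, -2.0, 3.0])
analytic = 2 * x
numeric = numerical_grad(lambda v: np.sum(v**2), x)
print(np.allclose(analytic, numeric, atol=1e-5))  # True if the gradients agree
```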

Highly customizable

Create your own custom layers, optimizers, and loss functions, which gives you the flexibility to build anything you desire

PyTorch like API

PyTorch's API is one of the most elegant API designs around, so we've adopted the same style

Neural Network module

nn contains some of the most commonly used optimizers, activations and loss functions required to train a Neural Network

Save and Load weights, model

Trained a model already? Then save its weights to a file and load them whenever required, or save the entire model to a file
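
For a rough picture of how whole-object persistence can work (dill is listed in the project's requirements), here is a generic sketch using dill directly. This is not neograd's own save/load API, whose exact functions may differ, and the `model` object below is just a stand-in:

```python
import dill

model = {"weights": [0.1, 0.2, 0.3]}   # stand-in for a trained model object

# persist the whole object to a file ...
with open('model.pkl', 'wb') as f:
    dill.dump(model, f)

# ... and restore it later
with open('model.pkl', 'rb') as f:
    restored_model = dill.load(f)

print(restored_model)
```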

Checkpoints

Suppose you're training a model and your computer runs out of juice; if you had waited until training finished to save the weights, you'd lose all of them. To prevent this, checkpoint your model across sessions to save the weights at regular intervals, along with additional supporting data

Example

```python
import neograd as ng
from neograd import nn
import numpy as np
from neograd.nn.loss import BCE
from neograd.nn.optim import Adam
from neograd.autograd.utils import grad_check
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, accuracy_score

# load dataset (binary classification problem)
X, y = make_circles(n_samples=1000, noise=0.05, random_state=100)
X_train, X_test, y_train, y_test = train_test_split(X, y)

num_train = 750 # number of train examples
num_test = 250 # number of test examples
num_iter = 50 # number of training iterations

# convert data into tensors
X_train, X_test = ng.tensor(X_train[:num_train,:]), ng.tensor(X_test[:num_test,:])
y_train, y_test = ng.tensor(y_train[:num_train].reshape(num_train,1)), ng.tensor(y_test[:num_test].reshape(num_test,1))

# define the structure of your neural net
class NN(nn.Model):
  def __init__(self):
    self.stack = nn.Sequential(
      nn.Linear(2,100),
      nn.ReLU(),
      nn.Linear(100,1),
      nn.Sigmoid()
    )

  def forward(self, inputs):
    return self.stack(inputs)

model = NN() # initialize a model
loss_fn = BCE() # initialize a loss function (Binary Cross Entropy)
optim = Adam(model.parameters(), 0.05) # initialize an optimizer

# training loop
for i in range(num_iter):
  optim.zero_grad() # zero out the gradients in the tensors
  outputs = model(X_train) # get the outputs by passing the training data to your model
  loss = loss_fn(outputs, y_train) # calculate the loss
  loss.backward() # initiate the backward pass to calculate the gradients
  optim.step() # update the parameters
  print(f"iter {i+1}/{num_iter}\nloss: {loss}\n")

with model.eval(): # put the model in evaluation mode
  test_outputs = model(X_test) # get the outputs of the model on test data
  preds = np.where(test_outputs.data>=0.5, 1, 0) # make predictions

print(classification_report(y_test.data.astype(int).flatten(), preds.flatten()))
print(accuracy_score(y_test.data.astype(int).flatten(), preds.flatten()))

grad_check(model, X_train, y_train, loss_fn) # perform gradient checking in your model
```

How is this any different from

  • Andrej Karpathy's micrograd
Natively supports only scalar values for computation, whereas we support scalars, vectors, and matrices, all compatible with NumPy broadcasting
  • George Hotz's tinygrad
Is obligated to stay under 1000 lines of code, which leads to cramped code; our implementation is much more readable and easier to understand. There is also no dealing with the C/C++ code that tinygrad uses for GPU acceleration
  • pytorch, tensorflow, etc
Large, messy codebases written mostly in C/C++ for efficiency, making it very hard to find your way around and understand what's going on. We have a pure Python implementation, making it easy to get started and understand what's happening under the hood

Resources

  • A big thank you to Andrej Karpathy for his CS231n lecture on Backpropagation which was instrumental in helping me gain a good grasp of the basic mechanisms of autograd
  • Thanks to Terence Parr and Jeremy Howard for their paper The Matrix Calculus You Need For Deep Learning, which helped me get rid of my fear of matrix calculus; it is beautifully written, starting from the very fundamentals and slowly transitioning into advanced topics

Owner

  • Name: pranav
  • Login: pranftw
  • Kind: user
  • Location: Bengaluru

Deep learner.

Citation (CITATION.cff)

cff-version: 1.1.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: Sastry
    given-names: Pranav
    orcid: https://orcid.org/0000-0003-2091-3790
title: neograd - A Deep Learning framework created from scratch using Python and NumPy
doi: 10.5281/zenodo.7387379
date-released: 2022-09-16

GitHub Events

Total
  • Watch event: 5
Last Year
  • Watch event: 5

Committers

Last synced: almost 3 years ago

All Time
  • Total Commits: 176
  • Total Committers: 1
  • Avg Commits per committer: 176.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Pranav Sastry p****i@g****m 176

Issues and Pull Requests

Last synced: 8 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 17 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 4
  • Total maintainers: 1
pypi.org: neograd

A deep learning framework created from scratch with Python and NumPy

  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 17 Last month
Rankings
Stargazers count: 6.4%
Dependent packages count: 6.6%
Forks count: 15.7%
Average: 17.6%
Downloads: 28.9%
Dependent repos count: 30.6%
Maintainers (1)
Last synced: 7 months ago

Dependencies

requirements.txt pypi
  • dill *
  • numpy *
tests/requirements.txt pypi
  • numpy * test
  • pytest * test
.github/workflows/python-app.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite