nadir

Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻

https://github.com/optimalfoundation/nadir

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • ✓
    CITATION.cff file
    Found CITATION.cff file
  • ✓
    codemeta.json file
    Found codemeta.json file
  • ✓
    .zenodo.json file
    Found .zenodo.json file
  • ○
    DOI references
  • ✓
    Academic publication links
    Links to: arxiv.org
  • ○
    Committers with academic emails
  • ○
    Institutional organization owner
  • ○
    JOSS paper metadata
  • ○
    Scientific vocabulary similarity
    Low similarity (14.9%) to scientific vocabulary

Keywords

adabelief adam-optimizer adamax adamp adamw amsgrad lion machine-learning optimization pytorch radam sgd-optimizer
Last synced: 4 months ago

Repository

Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻

Basic Info
  • Host: GitHub
  • Owner: OptimalFoundation
  • License: apache-2.0
  • Language: Python
  • Default Branch: main
  • Homepage: https://nadir.rtfd.io
  • Size: 22.8 MB
Statistics
  • Stars: 14
  • Watchers: 1
  • Forks: 3
  • Open Issues: 13
  • Releases: 5
Topics
adabelief adam-optimizer adamax adamp adamw amsgrad lion machine-learning optimization pytorch radam sgd-optimizer
Created almost 3 years ago · Last pushed over 1 year ago
Metadata Files
Readme Funding Code of conduct Citation Roadmap

README.md

Nadir

Nadir (pronounced nay-di-ah) is derived from the Arabic word nazir, and means "the lowest point of a space". In optimisation problems, it is equivalent to the point of minimum. If you are a machine learning enthusiast, a data scientist, or an AI practitioner, you know how important it is to use the best optimisation algorithms to train your models. The purpose of this library is to help optimise machine learning models and enable them to reach their point of nadir in the appropriate context.

Nadir follows the principles of Simplicity, Modularity and Composability. Read more in the Core Philosophy section.

Installation

You can either choose to install from the PyPI index, in the following manner:

```bash
$ pip install nadir
```

or install from source, in the following manner:

```bash
$ pip install git+https://github.com/OptimalFoundation/nadir.git
```

Note: Installing from source might give you an unstable package with breaking changes. It is recommended that you install from PyPI itself.

Simple Usage

```python
import nadir as nd

# some model setup here...
model = ...

# set up your Nadir optimiser
config = nd.SGDConfig(lr=learning_rate)
optimizer = nd.SGD(model.parameters(), config)

# call the optimizer step
optimizer.step()
```

Core Philosophy

Nadir was built to provide a sense of uniformity and integration that might be lacking in the optimisation community, based on the simple idea that optimisers are not islands. They usually inherit characteristics from other optimisers, and they in turn inspire new ones. So why not make optimisers inheritable, composable and modular objects?

The core concepts that each optimiser in Nadir follows are:

  1. Simplicity is of key importance. We prefer readability and simplicity over performance. Experiment, test, and verify what works and what does not with Nadir; afterwards, write custom fused kernels for your favourite optimisers to recover performance.

  2. Modularity means that each new optimiser should minimise the extra logic it adds, by adding or editing only the parts that actually need to change. If you want a different momentum in Adam, you only change the momentum function after inheriting from Adam. There is no need to write the entire optimiser from scratch.

  3. Composability implies that we can take pieces from one optimiser and add them to another without much effort. You can build an optimiser that mixes RAdam and NAdam with the properties of AdaBelief, if you so desire! That's what makes this library really powerful.
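The inheritance idea above can be sketched in plain Python. This is an illustrative toy, not Nadir's actual API: the `SGD` and `Momentum` classes and the `direction` hook are hypothetical names invented here to show how a subclass can swap out just the update direction while reusing the parent's step logic.

```python
# Toy illustration of "optimisers are not islands": Momentum inherits
# from SGD and overrides only the update direction; step() is reused.
# These classes are NOT Nadir's real API -- they are a minimal sketch.

class SGD:
    def __init__(self, params, lr=0.1):
        self.params = params  # a flat list of floats, for simplicity
        self.lr = lr

    def direction(self, i, grad):
        # Plain SGD steps straight along the gradient.
        return grad

    def step(self, grads):
        for i, g in enumerate(grads):
            self.params[i] -= self.lr * self.direction(i, g)


class Momentum(SGD):
    def __init__(self, params, lr=0.1, beta=0.9):
        super().__init__(params, lr)
        self.beta = beta
        self.velocity = [0.0] * len(params)

    def direction(self, i, grad):
        # Only the direction changes: accumulate a velocity term.
        self.velocity[i] = self.beta * self.velocity[i] + grad
        return self.velocity[i]


params = [1.0]
opt = Momentum(params, lr=0.1)
opt.step([1.0])  # velocity 1.0 -> params 0.9
opt.step([1.0])  # velocity 1.9 -> params ~0.71
```

A "composed" optimiser in this style would simply inherit from one class and borrow the overridden methods of another.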

Supported Optimisers

| Optimiser | Paper |
|:---------:|:-----:|
| SGD | https://paperswithcode.com/method/sgd |
| Momentum | https://paperswithcode.com/method/sgd-with-momentum |
| NAG | https://jlmelville.github.io/mize/nesterov.html |
| Adagrad | https://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf |
| RMSProp | https://paperswithcode.com/method/rmsprop |
| Adam | https://arxiv.org/abs/1412.6980v9 |
| Adamax | https://arxiv.org/abs/1412.6980v9 |
| AdamW | https://arxiv.org/abs/1711.05101v3 |
| Adadelta | https://arxiv.org/abs/1212.5701v1 |
| AMSGrad | https://arxiv.org/abs/1904.09237v1 |
| RAdam | https://arxiv.org/abs/1908.03265v4 |
| Lion | https://arxiv.org/abs/2302.06675 |
| AdaBelief | https://arxiv.org/pdf/2010.07468v5.pdf |
| NAdam | http://cs229.stanford.edu/proj2015/054_report.pdf |
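Many of the optimisers in the table above are themselves compositions. As a brief illustration (these are the standard update equations from the Adam paper linked above, not code from this library), Adam combines a momentum-style first-moment estimate with an RMSProp-style second-moment estimate, plus bias correction:

```latex
% Adam update: first moment m_t (momentum-like), second moment v_t
% (RMSProp-like), bias-corrected, applied to parameters theta.
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1-\beta_1^t), \qquad \hat{v}_t = v_t / (1-\beta_2^t) \\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / \left(\sqrt{\hat{v}_t} + \epsilon\right)
\end{aligned}
```

Setting $\beta_1 = 0$ recovers an RMSProp-like method, which is exactly the kind of structural overlap that Nadir's inheritance model is meant to exploit.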

Acknowledgements

We would like to thank all the amazing contributors of this project who spent so much effort making this repository awesome! :heart:

Citation

You can use the Cite this repository button provided by GitHub or use the following BibTeX:

```bibtex
@software{MinhasNadir,
  title   = {{Nadir: A Library for Bleeding-Edge Optimizers in PyTorch}},
  author  = {Minhas, Bhavnick and Kalathukunnel, Apsal},
  year    = 2023,
  month   = 3,
  version = {0.0.2}
}
```

Owner

  • Name: Optimal Foundation Inc.
  • Login: OptimalFoundation
  • Kind: organization

Citation (CITATION.cff)

# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: Dawn of Eve
message: >-
  If you are using this research in your own work or
  any code from the repository, please cite us!
type: software
authors:
  - given-names: Bhavnick
    family-names: Minhas
    email: bhavnicksm@gmaill.com
    
  - given-names: Apsal
    family-names: Kalathukunnel
    email: apsalshbk550@gmail.com

GitHub Events

Total
  • Watch event: 1
Last Year
  • Watch event: 1

Committers

Last synced: over 1 year ago

All Time
  • Total Commits: 152
  • Total Committers: 6
  • Avg Commits per committer: 25.333
  • Development Distribution Score (DDS): 0.546
Past Year
  • Commits: 28
  • Committers: 2
  • Avg Commits per committer: 14.0
  • Development Distribution Score (DDS): 0.321
Top Committers
Name Email Commits
bhavnicksm b****m@g****m 69
Bhavnick Minhas 1****m 31
Bhavnick Yali b****k@y****i 26
apsal1 a****l@y****i 17
Apsal S Kalathukunnel 4****l 7
nithinkr2000 6****0 2
Committer Domains (Top 20 + Academic)
yali.ai: 2

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 38 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 7
  • Total maintainers: 1
pypi.org: nadir

Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻

  • Versions: 7
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 38 Last month
Rankings
Dependent packages count: 6.6%
Downloads: 12.1%
Forks count: 17.3%
Average: 18.4%
Stargazers count: 25.5%
Dependent repos count: 30.6%
Maintainers (1)
Last synced: 5 months ago