https://github.com/aidinhamedi/ml-optimizer-benchmark

A benchmark suite for optimization algorithms in PyTorch.

Science Score: 26.0%

This score indicates how likely the project is to be science-related, based on the following indicators:

  • CITATION.cff file
  • codemeta.json file (found)
  • .zenodo.json file (found)
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity: low (9.3%)

Keywords

benchmark deep-learning machine-learning ml optimization optimization-algorithms optimizer python python3 pytorch test
Last synced: 6 months ago

Repository

A benchmark suite for optimization algorithms in PyTorch.

Basic Info
Statistics
  • Stars: 2
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 2
Topics
benchmark deep-learning machine-learning ml optimization optimization-algorithms optimizer python python3 pytorch test
Created 6 months ago · Last pushed 6 months ago
Metadata Files
Readme · License

README.md

Optimizer Benchmark

A benchmarking suite for evaluating and comparing PyTorch optimization algorithms on 2D mathematical functions. This project uses pytorch_optimizer and Optuna for hyperparameter tuning, and generates visualizations of optimizer trajectories across optimization test functions.

> [!WARNING]
> **Important limitations:** These benchmark results are based on synthetic 2D functions and may not reflect real-world performance when training actual neural networks. The rankings should be used only as a reference, not as definitive guidance for choosing optimizers in practical applications.
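To make the setup concrete, here is a minimal sketch, assuming plain `torch.optim` and the 2D Rosenbrock test function, of the kind of trajectory trace the suite records. It is illustrative only, not the repository's actual code; the start point and hyperparameters are hypothetical.

```python
# Minimal sketch (not the project's actual code): trace an optimizer's
# path on the 2D Rosenbrock function with a plain torch optimizer.
import torch

def rosenbrock(p: torch.Tensor) -> torch.Tensor:
    x, y = p[0], p[1]
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

p = torch.tensor([-1.5, 2.0], requires_grad=True)  # hypothetical start point
opt = torch.optim.Adam([p], lr=0.05)               # hypothetical hyperparameters

trajectory = [p.detach().clone()]
for _ in range(500):
    opt.zero_grad()
    rosenbrock(p).backward()
    opt.step()
    trajectory.append(p.detach().clone())

print(rosenbrock(p).item())  # final error on this one test function
```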

🌶️ Features

  • Benchmarks most of the supported optimizers in pytorch_optimizer.
  • Hyperparameter search with Optuna (TPE sampler); a sketch follows this list.
  • Visualization of optimization trajectories on:
    • Ackley
    • Cross-in-Tray
    • Drop-Wave
    • Eggholder
    • Gramacy & Lee (2D) (Not yet added to the results)
    • Griewank
    • Holder Table (Not yet added to the results)
    • Langermann
    • Lévy
    • Lévy 13 (Not yet added to the results)
    • Rastrigin
    • Rosenbrock
    • Schaffer 2
    • Schaffer 4 (Not yet added to the results)
    • Shubert
    • Styblinski–Tang
  • Configurable search spaces, iteration counts, ignored optimizers, and more via config.toml.
  • Saves results and plots for later analysis.
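The Optuna step might look roughly like the sketch below. The learning-rate range, trial count, and the use of Adam are illustrative assumptions, not the repository's actual search space.

```python
# Hedged sketch of the Optuna (TPE) tuning step; the search range and
# trial count are assumptions, not the repository's configuration.
import optuna
import torch

def rosenbrock(p: torch.Tensor) -> torch.Tensor:
    x, y = p[0], p[1]
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_float("lr", 1e-4, 1.0, log=True)  # assumed search range
    p = torch.tensor([-1.5, 2.0], requires_grad=True)
    opt = torch.optim.Adam([p], lr=lr)
    for _ in range(200):
        opt.zero_grad()
        rosenbrock(p).backward()
        opt.step()
    return rosenbrock(p).item()  # final function value drives the search

study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=50)
print(study.best_params)
```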

🚀 Quick Start

```bash
# Clone the repository
git clone --depth 1 https://github.com/AidinHamedi/ML-Optimizer-Benchmark.git
cd ML-Optimizer-Benchmark

# Install dependencies
uv sync

# Run the benchmark
python runner.py
```

The script will:

  1. Load settings from config.toml (a loading sketch follows these steps).
  2. Iterate through available optimizers.
  3. Run hyperparameter tuning with Optuna.
  4. Save results and visualizations under ./results/.
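For step 1, a minimal way to read such a file is Python's standard-library tomllib (3.11+). The key names below are hypothetical, since the repository's actual config.toml schema is not shown here.

```python
# Hedged sketch of step 1: reading settings from config.toml.
# The keys ("max_iters", "ignore_optimizers", "lr_range") are
# illustrative assumptions, not the repository's actual schema.
import tomllib

with open("config.toml", "rb") as f:
    config = tomllib.load(f)

max_iters = config.get("max_iters", 500)               # iteration count per run
ignored = set(config.get("ignore_optimizers", []))     # optimizers to skip
lr_low, lr_high = config.get("lr_range", [1e-4, 1.0])  # search-space bounds for Optuna
```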

📊 Visualizations

Newest release 📦

Go to newest release
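Before the results, here is a rough, self-contained illustration of how one of these trajectory plots could be rendered with matplotlib; this is an assumption, and the repository may draw its figures differently.

```python
# Hedged sketch of a trajectory visualization; the project's actual
# rendering may differ. Re-runs a short Adam trajectory on Rosenbrock
# so the snippet is self-contained.
import matplotlib.pyplot as plt
import numpy as np
import torch

def rosenbrock(x, y):
    # Works on torch scalars (optimization) and numpy grids (contour)
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

p = torch.tensor([-1.5, 2.0], requires_grad=True)
opt = torch.optim.Adam([p], lr=0.05)
points = []
for _ in range(300):
    opt.zero_grad()
    rosenbrock(p[0], p[1]).backward()
    opt.step()
    points.append(p.detach().clone())

X, Y = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-1, 3, 200))
traj = torch.stack(points).numpy()
plt.contourf(X, Y, np.log1p(rosenbrock(X, Y)), levels=50)  # log scale tames the range
plt.plot(traj[:, 0], traj[:, 1], "w.-", markersize=2)      # optimizer path in white
plt.savefig("adam_rosenbrock.png")                         # hypothetical output name
```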

| Rank | Optimizer | Average Error Rate | Vis |
|------|-----------|--------------------|-----|
| 1 | emonavi | 2.57147 | Open |
| 2 | emofact | 2.8488 | Open |
| 3 | emozeal | 2.8488 | Open |
| 4 | yogi | 2.97492 | Open |
| 5 | signsgd | 3.04044 | Open |
| 6 | sophiah | 3.0733 | Open |
| 7 | focus | 3.08895 | Open |
| 8 | tiger | 3.10801 | Open |
| 9 | stablespam | 3.60424 | Open |
| 10 | soap | 3.69787 | Open |
| 11 | apollo | 3.70088 | Open |
| 12 | fira | 3.70088 | Open |
| 13 | galore | 3.70088 | Open |
| 14 | adam | 4.03249 | Open |
| 15 | adanorm | 4.1164 | Open |
| 16 | adamp | 4.18049 | Open |
| 17 | adagc | 4.24922 | Open |
| 18 | kron | 4.34267 | Open |
| 19 | aida | 4.38717 | Open |
| 20 | swats | 4.40458 | Open |
| 21 | spam | 4.48192 | Open |
| 22 | ademamix | 4.61658 | Open |
| 23 | emolynx | 4.62471 | Open |
| 24 | lion | 4.62471 | Open |
| 25 | adatam | 4.80643 | Open |
| 26 | adashift | 5.12098 | Open |
| 27 | nadam | 5.13738 | Open |
| 28 | adammini | 5.14225 | Open |
| 29 | fromage | 5.19338 | Open |
| 30 | ranger25 | 5.34414 | Open |
| 31 | novograd | 5.36078 | Open |
| 32 | adadelta | 5.4427 | Open |
| 33 | adabelief | 5.46737 | Open |
| 34 | adalite | 5.69652 | Open |
| 35 | adasmooth | 5.71434 | Open |
| 36 | exadam | 5.98366 | Open |
| 37 | nero | 6.13398 | Open |
| 38 | fadam | 6.3439 | Open |
| 39 | adahessian | 6.50315 | Open |
| 40 | laprop | 6.54265 | Open |
| 41 | adapnm | 6.66368 | Open |
| 42 | scionlight | 6.67052 | Open |
| 43 | dadaptadan | 6.7595 | Open |
| 44 | ranger21 | 7.04377 | Open |
| 45 | rmsprop | 7.14312 | Open |
| 46 | vsgd | 7.28179 | Open |
| 47 | adopt | 7.40785 | Open |
| 48 | adamax | 7.6868 | Open |
| 49 | simplifiedademamix | 7.81188 | Open |
| 50 | sm3 | 8.17468 | Open |
| 51 | ftrl | 8.18237 | Open |
| 52 | schedulefreeadamw | 8.27146 | Open |
| 53 | asgd | 8.47956 | Open |
| 54 | came | 8.70882 | Open |
| 55 | lamb | 8.77041 | Open |
| 56 | tam | 8.94983 | Open |
| 57 | adan | 9.30751 | Open |
| 58 | emoneco | 9.58197 | Open |
| 59 | apollodqn | 9.63575 | Open |
| 60 | padam | 10.4026 | Open |
| 61 | diffgrad | 10.7386 | Open |
| 62 | grokfastadamw | 10.9842 | Open |
| 63 | lars | 12.0577 | Open |
| 64 | racs | 12.0613 | Open |
| 65 | sgd | 12.6166 | Open |
| 66 | sgdp | 12.6166 | Open |
| 67 | prodigy | 13.0301 | Open |
| 68 | pid | 13.1682 | Open |
| 69 | scion | 13.769 | Open |
| 70 | scalableshampoo | 13.9738 | Open |
| 71 | dadaptlion | 14.0853 | Open |
| 72 | radam | 14.4628 | Open |
| 73 | adabound | 14.5819 | Open |
| 74 | avagrad | 15.9035 | Open |
| 75 | qhm | 16.0966 | Open |
| 76 | ranger | 16.7279 | Open |
| 77 | pnm | 16.8118 | Open |
| 78 | aggmo | 16.8834 | Open |
| 79 | schedulefreesgd | 17.2629 | Open |
| 80 | adai | 17.3909 | Open |
| 81 | accsgd | 17.5586 | Open |
| 82 | madgrad | 17.8124 | Open |
| 83 | dadaptadagrad | 18.0346 | Open |
| 84 | qhadam | 18.6783 | Open |
| 85 | dadaptsgd | 18.9387 | Open |
| 86 | gravity | 19.0526 | Open |
| 87 | schedulefreeradam | 20.9976 | Open |
| 88 | kate | 21.4434 | Open |
| 89 | adamg | 21.4805 | Open |
| 90 | adamod | 21.6752 | Open |
| 91 | grams | 21.7464 | Open |
| 92 | sgdsai | 21.7469 | Open |
| 93 | adafactor | 22.9181 | Open |
| 94 | mars | 26.4904 | Open |
| 95 | shampoo | 28.3101 | Open |
| 96 | srmm | 39.6709 | Open |
| 97 | adams | 125.205 | Open |
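How the Average Error Rate column is computed is not specified here; a plausible (assumed) aggregation is simply the mean of each tuned optimizer's final error over all benchmark functions:

```python
# Assumed aggregation, for illustration only: average a tuned optimizer's
# final error across all benchmark functions to get one ranking score.
final_errors = {"ackley": 0.12, "rastrigin": 1.8, "rosenbrock": 0.4}  # hypothetical values
average_error_rate = sum(final_errors.values()) / len(final_errors)
print(round(average_error_rate, 5))
```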

🤝 Contributing

Contributions are welcome! Please feel free to submit a pull request or open an issue.

📚 References

  • Virtual Library of Simulation Experiments: Test Functions and Datasets for Optimization Algorithms. Simon Fraser University, curated by Derek Bingham (inquiries: dbingham@stat.sfu.ca). https://www.sfu.ca/~ssurjano/optimization.html

  • Kim, H. (2021). pytorch_optimizer: optimizer & lr scheduler & loss function collections in PyTorch (Version 2.12.0) [Computer software]. https://github.com/kozistr/pytorch_optimizer

📝 License

 Copyright (c) 2025 Aidin Hamedi

 This software is released under the MIT License.
 https://opensource.org/licenses/MIT

Owner

  • Name: Aidin
  • Login: AidinHamedi
  • Kind: user

GitHub Events

Total
  • Push event: 46
  • Create event: 2
Last Year
  • Push event: 46
  • Create event: 2

Issues and Pull Requests

Last synced: 6 months ago


Dependencies

pyproject.toml (pypi)
  • matplotlib >=3.10.5
  • numpy >=2.3.2
  • pillow >=11.3.0
  • torch >=2.7.0
  • torchvision >=0.22.0
uv.lock (pypi)
  • babel 2.17.0
  • backrefs 5.9
  • certifi 2025.8.3
  • charset-normalizer 3.4.3
  • click 8.2.1
  • colorama 0.4.6
  • contourpy 1.3.3
  • cycler 0.12.1
  • filelock 3.18.0
  • fonttools 4.59.0
  • fsspec 2025.7.0
  • ghp-import 2.1.0
  • idna 3.10
  • jinja2 3.1.6
  • kiwisolver 1.4.9
  • markdown 3.8.2
  • markupsafe 3.0.2
  • matplotlib 3.10.5
  • mergedeep 1.3.4
  • mkdocs 1.6.1
  • mkdocs-get-deps 0.2.0
  • mkdocs-material 9.6.16
  • mkdocs-material-extensions 1.3.1
  • ml-optimizer-benchmark 0.1.0
  • mpmath 1.3.0
  • narwhals 2.1.0
  • networkx 3.5
  • numpy 2.3.2
  • packaging 25.0
  • paginate 0.5.7
  • pathspec 0.12.1
  • pillow 11.3.0
  • platformdirs 4.3.8
  • plotly 6.2.0
  • pygments 2.19.2
  • pymdown-extensions 10.16.1
  • pyparsing 3.2.3
  • python-dateutil 2.9.0.post0
  • pyyaml 6.0.2
  • pyyaml-env-tag 1.1
  • requests 2.32.4
  • setuptools 80.9.0
  • six 1.17.0
  • sympy 1.14.0
  • torch 2.8.0
  • torch 2.8.0+cpu
  • torchvision 0.23.0
  • torchvision 0.23.0+cpu
  • typing-extensions 4.14.1
  • urllib3 2.5.0
  • watchdog 6.0.0