DeepHyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning - Published in JOSS (2025)
PyXAB
PyXAB - A Python Library for $\mathcal{X}$-Armed Bandit and Online Blackbox Optimization Algorithms - Published in JOSS (2024)
BayesO
BayesO: A Bayesian optimization framework in Python - Published in JOSS (2023)
ECabc
ECabc: A feature tuning program focused on Artificial Neural Network hyperparameters - Published in JOSS (2019)
Osprey
Osprey: Hyperparameter Optimization for Machine Learning - Published in JOSS (2016)
flaml
A fast library for AutoML and tuning.
smac
SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization
tpot
A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.
ray
Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
neural-pipeline-search
Neural Pipeline Search (NePS): Helps deep learning experts find the best neural pipeline.
https://github.com/epistasislab/tpot2
A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.
polyaxon
MLOps tools for managing and orchestrating the machine learning lifecycle.
mljar-supervised
Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation
metasklearn
MetaSklearn: A Metaheuristic-Powered Hyperparameter Optimization Framework for Scikit-Learn Models.
Robyn
Robyn is an experimental, AI/ML-powered, open-source Marketing Mix Modeling (MMM) package from Meta Marketing Science. Its mission is to democratise modeling knowledge, inspire the industry through innovation, reduce human bias in the modeling process, and build a strong open-source marketing science community.
https://github.com/polyaxon/hypertune
A library for performing hyperparameter optimization
hyperas
Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
https://github.com/project-codeflare/codeflare
Simplifying the definition, execution, scaling, and deployment of pipelines on the cloud.
https://github.com/bbopt/hypernomad
A library for the hyperparameter optimization of deep neural networks
syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
dpl
[NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power laws that achieves state-of-the-art results across diverse benchmarks.
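Multi-fidelity methods such as the one above spend small budgets on many configurations and promote only the strongest to larger budgets. A generic successive-halving sketch of that idea (not the paper's deep-power-law method; all names below are illustrative):

```python
def successive_halving(configs, evaluate, min_budget=1, eta=2):
    """Repeatedly evaluate all surviving configs at the current budget,
    keep the best 1/eta of them, and multiply the budget by eta."""
    budget = min_budget
    while len(configs) > 1:
        scores = [(evaluate(cfg, budget), cfg) for cfg in configs]
        scores.sort(key=lambda pair: pair[0])  # lower loss is better
        configs = [cfg for _, cfg in scores[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

# Toy demo: configs are scalars and the "loss" ignores the budget entirely.
best = successive_halving([0.1, 0.2, 0.5, 0.9], lambda cfg, budget: abs(cfg - 0.5))
```

Real multi-fidelity optimizers differ mainly in how `evaluate` uses the budget (epochs, subsample size) and in how survivors are chosen; the halving loop itself is the common skeleton.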
mlp_hpp_analysis
This repository is the codebase for the paper entitled "Exploring the Intricacies of Neural Network Optimization".
propulate
Propulate is an asynchronous population-based optimization algorithm and software package for global optimization and hyperparameter search on high-performance computers.
https://github.com/ahmedshahriar/customer-churn-prediction
Extensive EDA of the IBM Telco customer churn dataset; implements various statistical hypothesis tests, performs single-level stacking ensembling, and tunes hyperparameters using Optuna.
dashai
DashAI provides a simple graphical user interface (GUI) that guides users step by step through creating, training, and saving a model.
agilerl
Streamlining reinforcement learning with RLOps. State-of-the-art RL algorithms and tools, with 10x faster training through evolutionary hyperparameter optimization.
bayesian-hyper-parameter-optimization-for-malware-detection
AI-CyberSec 2021 Workshop CEUR publication (AI-2021, the Forty-first SGAI International Conference).
https://github.com/machinelearningnuremberg/deeprankingensembles
[ICLR 2023] Deep Ranking Ensembles for Hyperparameter Optimization
lfads-torch
A PyTorch implementation of Latent Factor Analysis via Dynamical Systems (LFADS) and AutoLFADS.
https://github.com/asreview/paper-megameta-hyperparameter-training
Hyperparameter-training for the Mega-Meta project
https://github.com/axect/pytorch_template
A flexible PyTorch template for ML experiments with configuration management, logging, and hyperparameter optimization.
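Despite their different search strategies, most of the libraries above automate the same basic loop: sample a configuration, evaluate an objective, keep the best result. A minimal dependency-free random-search sketch of that loop (the search space and objective below are illustrative, not any library's API):

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Sample configurations uniformly from `space` and return the best one.

    `space` maps each hyperparameter name to a (low, high) range;
    `objective` maps a configuration dict to a loss to minimize."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        loss = objective(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

# Toy objective: a quadratic bowl whose optimum is lr=0.1, momentum=0.9.
space = {"lr": (0.001, 1.0), "momentum": (0.0, 1.0)}
objective = lambda c: (c["lr"] - 0.1) ** 2 + (c["momentum"] - 0.9) ** 2
cfg, loss = random_search(objective, space, n_trials=200)
```

The libraries in this list replace the uniform sampler with smarter proposal mechanisms (Bayesian models, bandits, evolutionary populations) and add parallel, asynchronous, or multi-fidelity execution around the same loop.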