DeepHyper

DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning - Published in JOSS (2025)

https://github.com/deephyper/deephyper

Science Score: 95.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 7 DOI reference(s) in README and JOSS metadata
  • Academic publication links
    Links to: joss.theoj.org
  • Committers with academic emails
    11 of 36 committers (30.6%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
    Published in Journal of Open Source Software

Keywords

automl deep-learning hpc hyperparameter-optimization machine-learning mpi multi-fidelity neural-architecture-search python pytorch raylib scalability uncertainty-quantification
Last synced: 4 months ago

Repository

DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning

Basic Info
Statistics
  • Stars: 296
  • Watchers: 12
  • Forks: 62
  • Open Issues: 14
  • Releases: 25
Topics
automl deep-learning hpc hyperparameter-optimization machine-learning mpi multi-fidelity neural-architecture-search python pytorch raylib scalability uncertainty-quantification
Created about 7 years ago · Last pushed 4 months ago
Metadata Files
Readme Contributing License

README.md

[Badges: DOI · GitHub tag (latest by date) · Documentation Status · License · PyPI Downloads]

DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning

DeepHyper is first and foremost a hyperparameter optimization (HPO) library. By leveraging this core HPO functionality, DeepHyper also provides neural architecture search, multi-fidelity, and ensemble capabilities. With DeepHyper, users can easily perform these tasks on a single machine or distributed across multiple machines, making it ideal for use in a variety of environments. Whether you’re a beginner looking to optimize your machine learning models or an experienced data scientist looking to streamline your workflow, DeepHyper has something to offer. So why wait? Start using DeepHyper today and take your machine learning skills to the next level!

Installation

Installation with pip:

```console
pip install deephyper
```

More details about the installation process can be found in our Installation documentation.
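A quick way to verify the install is to import the package and print its version (a minimal check, assuming the package exposes the standard `__version__` attribute):

```console
python -c "import deephyper; print(deephyper.__version__)"
```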

Quickstart

The black-box function named run takes an input job whose job.parameters attribute contains the variables to optimize. The run-function is then bound to an Evaluator, which is in charge of distributing the computation of multiple evaluations. Finally, a Bayesian optimization search named CBO is created and executed to find the values of job.parameters that MAXIMIZE the return value of run(job).

```python
from deephyper.hpo import HpProblem, CBO
from deephyper.evaluator import Evaluator


def run(job):
    # Black-box function to maximize.
    x = job.parameters["x"]
    b = job.parameters["b"]
    function = job.parameters["function"]

    if function == "linear":
        y = x + b
    elif function == "cubic":
        y = x**3 + b

    return y


def optimize():
    # Define the search space of hyperparameters.
    problem = HpProblem()
    problem.add_hyperparameter((-10.0, 10.0), "x")  # continuous in [-10, 10]
    problem.add_hyperparameter((0, 10), "b")  # discrete in [0, 10]
    problem.add_hyperparameter(["linear", "cubic"], "function")  # categorical

    # Distribute evaluations of the run-function across 2 worker processes.
    evaluator = Evaluator.create(
        run,
        method="process",
        method_kwargs={
            "num_workers": 2,
        },
    )

    # Bayesian optimization search over the defined problem.
    search = CBO(
        problem,
        evaluator,
        random_state=42,
        solution_selection="argmax_obs",
    )
    results = search.search(max_evals=100)

    return results


if __name__ == "__main__":
    results = optimize()
    print(results)

    row = results.iloc[-1]
    print("\nOptimum values")
    print("function:", row["sol.p:function"])
    print("x:", row["sol.p:x"])
    print("b:", row["sol.p:b"])
    print("y:", row["sol.objective"])
```

This outputs the following results, where the best parameters found are function == "cubic", x ≈ 9.99, and b == 10.

```verbatim
     p:b p:function       p:x    objective  job_id job_status  m:timestamp_submit  m:timestamp_gather  sol.p:b sol.p:function   sol.p:x  sol.objective
0      7      cubic -1.103350     5.656803       0       DONE            0.011795            0.905777        3          cubic  8.374450     590.312101
1      3      cubic  8.374450   590.312101       1       DONE            0.011875            0.906027        3          cubic  8.374450     590.312101
2      6      cubic  4.680560   108.540056       2       DONE            0.917542            0.918856        3          cubic  8.374450     590.312101
3      9     linear  8.787395    17.787395       3       DONE            0.917645            0.929052        3          cubic  8.374450     590.312101
4      6      cubic  9.109560   761.948419       4       DONE            0.928757            0.938856        6          cubic  9.109560     761.948419
..   ...        ...       ...          ...     ...        ...                 ...                 ...      ...            ...       ...            ...
96     9      cubic  9.998937  1008.681250      96       DONE           33.905465           34.311504       10          cubic  9.999978    1009.993395
97    10      cubic  9.999485  1009.845416      97       DONE           34.311124           34.777270       10          cubic  9.999978    1009.993395
98    10      cubic  9.996385  1008.915774      98       DONE           34.776732           35.236710       10          cubic  9.999978    1009.993395
99    10      cubic  9.997400  1009.220073      99       DONE           35.236190           35.687774       10          cubic  9.999978    1009.993395
100   10      cubic  9.999833  1009.949983     100       DONE           35.687380           36.111318       10          cubic  9.999978    1009.993395

[101 rows x 12 columns]

Optimum values
function: cubic
x: 9.99958232225758
b: 10
y: 1009.8747019108424
```
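The returned results object is a pandas DataFrame, which DeepHyper typically also writes to results.csv in the search's log directory (the current working directory by default). A minimal post-processing sketch under those assumptions:

```python
# Sketch: reload and inspect the search results, assuming the search above
# wrote `results.csv` to its log directory (here, the current directory).
import pandas as pd

results = pd.read_csv("results.csv")

# Row with the highest observed objective, using the column names shown above.
best = results.loc[results["objective"].idxmax()]
print(best[["p:function", "p:x", "p:b", "objective"]])
```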

More details about this example can be found in our Quick Start documentation.

How do I learn more?

Check out our online documentation with API reference and examples: https://deephyper.readthedocs.io

Citing DeepHyper

To cite this repository:

@article{Egele2025,
  doi = {10.21105/joss.07975},
  url = {https://doi.org/10.21105/joss.07975},
  year = {2025},
  publisher = {The Open Journal},
  volume = {10},
  number = {109},
  pages = {7975},
  author = {Romain Egele and Prasanna Balaprakash and Gavin M. Wiggins and Brett Eiffert},
  title = {DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning},
  journal = {Journal of Open Source Software}
}

How can I participate?

Questions, comments, feature requests, bug reports, etc., can be directed to GitHub Issues.

Patches through pull requests are much appreciated, on the software itself as well as the documentation.

More documentation about how to contribute is available at deephyper.readthedocs.io/en/latest/developer_guides/contributing.html.

Acknowledgments

  • Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018–present)
  • Argonne Leadership Computing Facility: This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
  • SLIK-D: Scalable Machine Learning Infrastructures for Knowledge Discovery, Argonne Computing, Environment and Life Sciences (CELS) Laboratory Directed Research and Development (LDRD) Program (2016–2018)

Copyright and license

Copyright © 2019, UChicago Argonne, LLC

DeepHyper is distributed under the terms of the BSD 3-Clause License. See LICENSE.

Argonne Patent & Intellectual Property File Number: SF-19-007

Owner

  • Name: DeepHyper Team
  • Login: deephyper
  • Kind: organization

JOSS Publication

DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning
Published
May 19, 2025
Volume 10, Issue 109, Page 7975
Authors
Romain Egele
Oak Ridge National Laboratory, Oak Ridge, TN, United States
Prasanna Balaprakash
Oak Ridge National Laboratory, Oak Ridge, TN, United States
Gavin M. Wiggins
Oak Ridge National Laboratory, Oak Ridge, TN, United States
Brett Eiffert
Oak Ridge National Laboratory, Oak Ridge, TN, United States
Editor
Mehmet Hakan Satman
Tags
machine learning hyperparameter optimization multi-fidelity neural architecture search ensemble high-performance computing

GitHub Events

Total
  • Create event: 71
  • Release event: 6
  • Issues event: 100
  • Watch event: 15
  • Delete event: 63
  • Issue comment event: 204
  • Push event: 356
  • Pull request review comment event: 18
  • Pull request review event: 23
  • Pull request event: 112
  • Fork event: 3
Last Year
  • Identical counts to the all-time totals above.

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 2,717
  • Total Committers: 36
  • Avg Commits per committer: 75.472
  • Development Distribution Score (DDS): 0.19
Past Year
  • Commits: 297
  • Committers: 3
  • Avg Commits per committer: 99.0
  • Development Distribution Score (DDS): 0.061
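The DDS figures can be reproduced by hand, assuming the usual definition DDS = 1 - (top committer's commits / total commits); a quick check against the all-time numbers:

```python
# Sketch: verify the all-time DDS, assuming DDS = 1 - top_commits / total_commits.
total_commits = 2717  # all-time commits (see stats above)
top_commits = 2200    # top committer, Deathn0t (see table below)
print(round(1 - top_commits / total_commits, 2))  # -> 0.19, matching the reported DDS
```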
Top Committers
Name Email Commits
Deathn0t r****e@g****m 2,200
msalim m****m@a****v 155
Prasanna p****h@g****m 134
minesweeter j****g@g****m 71
Dipendra Jha d****9@g****m 29
Kyle Gerard Felker f****r@a****v 15
Gavin Wiggins 6****g 13
Matthieu Dorier m****r@a****v 12
felixeperez 3****z 12
Bethany Lusch b****h@a****v 9
Romit Maulik r****k@t****v 7
Bethany Lusch 9****L 7
Romit Maulik r****k@a****v 6
Tyler H Chang t****g@a****v 5
Brett Eiffert b****t@g****m 5
Yixuan Sun y****e@g****m 4
Shengli Jiang 4****7 3
Romain Egele r****e@a****v 3
romain egele r****e@m****e 3
Albert Lam a****3@h****m 2
John Doe r****0@g****m 2
Taylor Childers t****s@g****m 2
Z223I 3****I 2
Zach 2****l 2
Denis Boyda b****d@m****u 2
Romit Maulik r****k@t****v 2
Akalanka 8****g 1
Hongyuan Liu l****y@g****m 1
romain egele r****e@r****e 1
Romain Egele r****e@n****v 1
and 6 more...

Issues and Pull Requests

Last synced: 4 months ago

All Time
  • Total issues: 145
  • Total pull requests: 157
  • Average time to close issues: 6 months
  • Average time to close pull requests: 6 days
  • Total issue authors: 36
  • Total pull request authors: 19
  • Average comments per issue: 1.48
  • Average comments per pull request: 1.56
  • Merged pull requests: 129
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 54
  • Pull requests: 110
  • Average time to close issues: 27 days
  • Average time to close pull requests: 1 day
  • Issue authors: 10
  • Pull request authors: 4
  • Average comments per issue: 0.96
  • Average comments per pull request: 1.65
  • Merged pull requests: 87
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • Deathn0t (93)
  • wigging (4)
  • y0z (3)
  • megh1241 (3)
  • felker (3)
  • robertu94 (2)
  • theSubsurfaceGuy (2)
  • OliVandy (2)
  • davdma (2)
  • evvaletov (2)
  • bretteiffert (2)
  • jungtaekkim (2)
  • jinz2014 (2)
  • sibyjackgrove (1)
  • nviz2 (1)
Pull Request Authors
  • Deathn0t (62)
  • wigging (32)
  • minesweeter (19)
  • bretteiffert (18)
  • thchang (6)
  • iamyixuan (4)
  • evvaletov (2)
  • albertkklam (2)
  • jbytecode (2)
  • sjiang87 (2)
  • boydad (2)
  • Sande33p (1)
  • jorgectf (1)
  • boneyag (1)
  • mdorier (1)
Top Labels
Issue Labels
docs (35) enhancement (31) bug (30) question (9)
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 3,799 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 3
  • Total versions: 40
  • Total maintainers: 3
pypi.org: deephyper

Massively Parallel Hyperparameter Optimization for Machine Learning

  • Documentation: http://deephyper.readthedocs.io
  • License: BSD 3-Clause License

    Copyright (c) 2018, UChicago Argonne, LLC and the DeepHyper Development Team. All Rights Reserved.

    Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
    1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
    2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
    3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

    THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  • Latest release: 0.11.0
    published 4 months ago
  • Versions: 40
  • Dependent Packages: 0
  • Dependent Repositories: 3
  • Downloads: 3,799 Last month
Rankings
  • Stargazers count: 4.3%
  • Forks count: 5.6%
  • Dependent packages count: 7.4%
  • Dependent repos count: 9.2%
  • Downloads: 13.0%
  • Average: 7.9%
Maintainers (3)
Last synced: 4 months ago

Dependencies

.github/workflows/ci.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v3 composite
  • codecov/codecov-action v1 composite
Dockerfile docker
  • continuumio/miniconda3 latest build
install/requirements.dev.macOS.arm64.txt pypi
  • jupyter * development
  • tensorflow-probability ==0.14.1 development
install/requirements.macOS.arm64.txt pypi
  • tensorflow-probability ==0.14.1
install/macOS/requirements.macOS.arm64.txt pypi
pyproject.toml pypi
setup.py pypi