https://github.com/bbopt/hypernomad

A library for the hyperparameter optimization of deep neural networks

Science Score: 10.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org, zenodo.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.1%) to scientific vocabulary

Keywords

blackbox-optimization categorical-variables deep-neural-networks hyperparameter-optimization hyperparameter-tuning hyperparameters neural-architecture-search nomad optimization python pytorch
Last synced: 5 months ago

Repository

A library for the hyperparameter optimization of deep neural networks

Basic Info
  • Host: GitHub
  • Owner: bbopt
  • License: lgpl-3.0
  • Language: C++
  • Default Branch: master
  • Homepage:
  • Size: 950 KB
Statistics
  • Stars: 17
  • Watchers: 5
  • Forks: 1
  • Open Issues: 1
  • Releases: 0
Topics
blackbox-optimization categorical-variables deep-neural-networks hyperparameter-optimization hyperparameter-tuning hyperparameters neural-architecture-search nomad optimization python pytorch
Created about 7 years ago · Last pushed about 4 years ago
Metadata Files
Readme License

README.md


Hyperparameter optimization of deep neural networks with HyperNOMAD


HyperNOMAD is a C++ and Python package dedicated to the hyperparameter optimization of deep neural networks. The package contains a blackbox specifically designed for this problem and provides a link with the NOMAD software used for the optimization. The blackbox takes as input a list of hyperparameters, builds a corresponding deep neural network, then trains, validates and tests it on a specific data set before returning the test error as a measure of performance. NOMAD is then used to minimize this error. The sections below give an overview of how to use the HyperNOMAD package.
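The blackbox contract described above can be sketched in a few lines of Python. This is an illustrative stand-in, not HyperNOMAD's actual interface: the function name `blackbox`, the chosen hyperparameters and the toy error surface are all invented for the example (a real blackbox would train and evaluate a PyTorch network here).

```python
# Illustrative sketch of the blackbox contract: the optimizer hands a
# vector of hyperparameters to the blackbox, which evaluates a model and
# returns a scalar test error to be minimized. The names and the toy
# error surface are invented -- this is NOT HyperNOMAD's real API.

def blackbox(hyperparameters):
    """Map a hyperparameter vector to a test error (lower is better)."""
    learning_rate, num_layers, dropout = hyperparameters
    # Stand-in for "build, train, validate, test a network": a smooth
    # surrogate with its minimum at lr=0.01, 3 layers, dropout=0.5.
    error = ((learning_rate - 0.01) ** 2
             + 0.1 * (num_layers - 3) ** 2
             + (dropout - 0.5) ** 2)
    return error

# A derivative-free optimizer such as NOMAD would call the blackbox
# repeatedly and keep the best point; here we just compare two trials.
trial_a = blackbox([0.1, 5, 0.8])
trial_b = blackbox([0.01, 3, 0.5])
print(trial_b < trial_a)  # prints True: the second configuration wins
```

The key design point is that the optimizer never sees gradients or model internals, only the returned error, which is why mesh adaptive direct search applies here.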

The following tutorial shows the different steps to take in order to run HyperNOMAD on a first example. The complete functionalities of HyperNOMAD are described in the documentation.

Prerequisites

In order to run HyperNOMAD correctly, please make sure to have:

  • A compiled version of NOMAD 3. Please note that HyperNOMAD is not compatible with NOMAD 4.
  • Python > 3.6
  • PyTorch
  • GCC > 9.0

Installation of HyperNOMAD

First, build the executable by running the following command:

```
make

building HyperNOMAD ...

To be able to run the example
the HYPERNOMAD_HOME environment variable
must be set to the HyperNOMAD home directory
```

When the compilation is successful, a message appears asking to set the environment variable HYPERNOMAD_HOME. This can be done by adding the following line to your .profile or .bashrc file:

export HYPERNOMAD_HOME=hypernomad_directory

The executable hypernomad.exe is located in the bin directory. You can check that the installation succeeded by running the command

$HYPERNOMAD_HOME/bin/./hypernomad.exe -i

which should return the following information:

```
--------------------------------------------------
HyperNomad - version 1.0
--------------------------------------------------
Using Nomad version 3.9.0 - www.gerad.ca/nomad
--------------------------------------------------

Run           : hypernomad.exe parameters_file
Info          : hypernomad.exe -i
Help          : hypernomad.exe -h
Version       : hypernomad.exe -v
Usage         : hypernomad.exe -u
Neighboors    : hypernomad.exe -n parameters_file

```

Getting started

The next step is to create a parameter file containing the information needed to specify the classification problem, the search space and the initial starting point. HyperNOMAD offers considerable flexibility in tuning a convolutional network by considering multiple aspects of the network at once, such as the architecture, the dropout rate, the choice of optimizer and the hyperparameters related to the optimization itself (learning rate, weight decay, momentum, ...), the batch size, etc. The user can choose to optimize all of these aspects, or select a few and fix the others to specific values. The user can also change the default range of each hyperparameter.

This information is passed through the parameter file using a specific syntax:

KEYWORD INITIAL_VALUE LOWER_BOUND UPPER_BOUND FIXED/VAR

Here is an example of an acceptable parameter file. First, the MNIST dataset is chosen and we specify that HyperNOMAD is allowed to try a maximum of 100 configurations. Then, the number of convolutional layers is fixed to 5 throughout the optimization; the two '-' appearing after the '5' mean that the default lower and upper bounds are not changed. The kernels, number of fully connected layers and activation function are respectively initialized at 3, 6 and 2 (Sigmoid), and the dropout rate is initialized at 0.6 with a new lower bound of 0.3 and upper bound of 0.8.

Finally, all the remaining hyperparameters that are not explicitly mentioned in this file are fixed to their default values during the optimization.

```
DATASET MNIST
MAXBBEVAL 100

# Optional information

NUMCONLAYERS 5 - - FIXED   # The initial value is fixed;
                           # lower and upper bounds have no
                           # influence when the parameter
                           # is fixed.

KERNELS 3                  # Only the initial value is set (not fixed);
                           # the lower and upper bounds
                           # have default values.

NUMFCLAYERS 6
ACTIVATIONFUNCTION 2
DROPOUTRATE 0.6 0.3 0.8    # The lower and upper bounds
                           # are set to values that are not
                           # the default ones.

REMAINING_HPS FIXED
```

More details are provided in the user guide section of the documentation.
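The KEYWORD INITIAL_VALUE LOWER_BOUND UPPER_BOUND FIXED/VAR syntax above can be illustrated with a small Python parser. This is a sketch of the file format only, not HyperNOMAD's own parser, and it deliberately simplifies (e.g. it does not special-case entries such as REMAINING_HPS):

```python
# Illustrative parser for one parameter-file line of the form:
#   KEYWORD INITIAL_VALUE [LOWER_BOUND] [UPPER_BOUND] [FIXED]
# A '-' means "keep the default bound"; comments start with '#'.
# Simplified sketch -- NOT HyperNOMAD's actual parser.

def parse_hp_line(line):
    tokens = line.split('#', 1)[0].split()  # drop trailing comment
    if not tokens:
        return None  # blank or comment-only line
    return {
        'keyword': tokens[0],
        'initial': tokens[1] if len(tokens) > 1 else None,
        'lower':   None if len(tokens) < 3 or tokens[2] == '-' else tokens[2],
        'upper':   None if len(tokens) < 4 or tokens[3] == '-' else tokens[3],
        'fixed':   tokens[-1].upper() == 'FIXED',
    }

print(parse_hp_line("NUMCONLAYERS 5 - - FIXED  # fixed during optimization"))
print(parse_hp_line("DROPOUTRATE 0.6 0.3 0.8"))
```

Note how the '-' placeholders map to "use the default bound" (here, `None`), matching the example file above.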

Running an optimization

The optimization starts by executing the command:

$HYPERNOMAD_HOME/bin/./hypernomad.exe parameter_file.txt

Multiple examples of parameter files are provided in the examples folder. One uses CIFAR-10 starting from the default starting point, and the others use MNIST with different configurations.

To use these files, the command is:

$HYPERNOMAD_HOME/bin/./hypernomad.exe $HYPERNOMAD_HOME/examples/cifar10_default.txt

or

$HYPERNOMAD_HOME/bin/./hypernomad.exe $HYPERNOMAD_HOME/examples/mnist_fc_optim.txt
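The command-line invocations above can also be assembled from a script, for instance when launching several parameter files in a batch. The wrapper below is an illustrative sketch (the function name and the `dry_run` flag are invented); it only builds the same command shown above from the HYPERNOMAD_HOME variable:

```python
import os
import subprocess

# Illustrative wrapper around the command-line call above. Assumes
# HYPERNOMAD_HOME is set and the package is built; the function name
# and dry_run flag are inventions for this sketch.
def run_hypernomad(parameter_file, dry_run=True):
    home = os.environ.get("HYPERNOMAD_HOME", "")
    exe = os.path.join(home, "bin", "hypernomad.exe")
    cmd = [exe, parameter_file]
    if dry_run:
        return cmd  # show the command without executing it
    return subprocess.run(cmd, check=True)

print(run_hypernomad("examples/mnist_fc_optim.txt"))
```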

Citing HyperNOMAD

If you use HyperNOMAD, please cite the following paper.

```

@article{lakhmiri2021hypernomad,
  title={HyperNOMAD: Hyperparameter Optimization of Deep Neural Networks Using Mesh Adaptive Direct Search},
  author={Lakhmiri, Dounia and Digabel, S{\'e}bastien Le and Tribes, Christophe},
  journal={ACM Transactions on Mathematical Software (TOMS)},
  volume={47},
  number={3},
  pages={1--27},
  year={2021},
  publisher={ACM New York, NY, USA}
}

```

Owner

  • Name: Blackbox Optimization
  • Login: bbopt
  • Kind: organization
  • Email: nomad@gerad.ca
  • Location: Montréal, Qc, Canada

GitHub organization of the research group from Polytechnique Montréal and GERAD for derivative-free and blackbox optimization

GitHub Events

Total
  • Fork event: 1
Last Year
  • Fork event: 1

Committers

Last synced: about 2 years ago

All Time
  • Total Commits: 235
  • Total Committers: 4
  • Avg Commits per committer: 58.75
  • Development Distribution Score (DDS): 0.2
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
DouniaLakhmiri d****i@g****m 188
Christophe Tribes c****s@p****a 36
ctribes c****s@g****a 10
Sébastien Le Digabel 3****l 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: about 2 years ago

All Time
  • Total issues: 3
  • Total pull requests: 14
  • Average time to close issues: about 1 month
  • Average time to close pull requests: about 10 hours
  • Total issue authors: 2
  • Total pull request authors: 2
  • Average comments per issue: 1.0
  • Average comments per pull request: 0.0
  • Merged pull requests: 12
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 1
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 1
  • Pull request authors: 0
  • Average comments per issue: 3.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • ctribes (2)
  • catevlog (1)
Pull Request Authors
  • ctribes (13)
  • DouniaLakhmiri (1)
Top Labels
Issue Labels
Pull Request Labels
enhancement (1)