hierarchical_nas_construction
Official repository for "Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars" (NeurIPS 2023)
Science Score: 64.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ✓ Academic publication links: links to arxiv.org
- ✓ Committers with academic emails: 1 of 1 committers (100.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (11.3%) to scientific vocabulary
Basic Info
Statistics
- Stars: 18
- Watchers: 9
- Forks: 5
- Open Issues: 1
- Releases: 0
Metadata Files
README.md
Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars
Note that this repository only contains the search spaces, evaluation pipelines, and experiment scripts. The core of our approach (e.g., how architectures are constructed from grammars) is implemented as part of the NePS project.
This repository contains the implementation of the experiments of our NeurIPS 2023 paper "Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars", which takes a functional view of neural architecture search by constructing architectures from context-free grammars.
If you would like to learn more about our work, please refer to our paper.
If you find our approach interesting for your own work, please cite the paper:
@inproceedings{schrodi2023construction,
  title={Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars},
  author={Schrodi, Simon and Stoll, Danny and Ru, Binxin and Sukthanker, Rhea and Brox, Thomas and Hutter, Frank},
  booktitle={Advances in Neural Information Processing Systems},
  year={2023},
}
A short-form version of this work was previously presented at the NeurIPS 2022 Meta-Learning Workshop under the title "Towards Discovering Neural Architectures from Scratch".
1. Installation
- Clone this repository.
- Create a conda environment
bash
conda create -n hnas python=3.7
and activate it
bash
conda activate hnas
- Install poetry
bash
bash install_dev_utils/poetry.sh
- Run `poetry install` (this can take quite a while) and then run `pip install opencv-python`.
2. Reproducing the paper results
2.1 Search on the cell-based or hierarchical NAS-Bench-201 search space
To reproduce these search experiments, run
bash
python experiments/optimize.py \
--working_directory $working_directory \
--data_path $data_path \
--search_space $search_space \
--objective $objective \
--searcher $searcher \
--surrogate_model $surrogate_model \
--seed $seed \
--pool_strategy evolution \
--pool_size 200 \
--n_init 10 \
--log \
--p_self_crossover 0.5
where $working_directory is the directory you want to save results to and $data_path is the path to the dataset. The other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| search_space | nb201_variable_multi_multi (hierarchical) or nb201_fixed_1_none (cell-based) |
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
| searcher | bayesian_optimization, random_search, regularized_evolution, or assisted_regularized_evolution |
| surrogate_model | gpwl_hierarchical (hWL), gpwl (WL), or gp_nasbot (NASBOT) (only active if searcher is set to bayesian_optimization) |
| seed | 777, 888, 999 |
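As a concrete illustration, the placeholders above can be filled in as follows. The script below only assembles and echoes the command as a dry run (it does not launch a search), and the paths `results/hnas` and `data/cifar10` are made-up examples, not repository defaults:

```shell
# Dry run: fill in illustrative values for the placeholders above and print
# the resulting command. Paths are made-up examples.
working_directory=results/hnas
data_path=data/cifar10
search_space=nb201_variable_multi_multi   # hierarchical search space
objective=nb201_cifar10
searcher=bayesian_optimization
surrogate_model=gpwl_hierarchical         # hWL surrogate
seed=777
cmd="python experiments/optimize.py \
  --working_directory $working_directory \
  --data_path $data_path \
  --search_space $search_space \
  --objective $objective \
  --searcher $searcher \
  --surrogate_model $surrogate_model \
  --seed $seed \
  --pool_strategy evolution \
  --pool_size 200 \
  --n_init 10 \
  --log \
  --p_self_crossover 0.5"
echo "$cmd"
```

Dropping `echo` and running the assembled command directly reproduces one (search space, objective, searcher, seed) configuration of the experiment grid.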
To run DARTS (or improved versions of DARTS) on the cell-based NAS-Bench-201 search space, run
bash
python darts_train_search.py \
--working_directory $working_directory \
--data_path $data_path \
--objective $objective \
--seed $seed \
--method $method
where $working_directory is the directory you want to save results to and $data_path is the path to the dataset. The other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
| seed | 777, 888, 999 |
| method | darts, dirichlet |
Note that you can add the --progressive flag to the above command to run DrNAS with the progressive learning scheme.
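For example, a DrNAS run with the progressive scheme could look like the dry run below (assuming, per the options table, that DrNAS corresponds to method `dirichlet`; the paths are made-up examples):

```shell
# Dry run: print an illustrative DrNAS command (method "dirichlet" plus the
# --progressive flag described above). Paths are made-up examples.
working_directory=results/drnas
data_path=data/cifar10
objective=nb201_cifar10
seed=777
method=dirichlet
cmd="python darts_train_search.py \
  --working_directory $working_directory \
  --data_path $data_path \
  --objective $objective \
  --seed $seed \
  --method $method \
  --progressive"
echo "$cmd"
```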
To evaluate the found architectures, run
bash
python $WORKDIR/hierarchical_nas_experiments/darts_evaluate.py \
--working_directory $working_directory \
--data_path $data_path \
--objective $objective
where $working_directory is the directory you saved the search results to and $data_path is the path to the dataset. The other variable can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
Note that DARTS (and improved versions of it) cannot be applied without further adaptation to our hierarchical NAS-Bench-201 search space, due to the exponential number of parameters that the supernet would contain.
2.2 Search on the activation function search space
To reproduce this search experiment, run
bash
python experiments/optimize.py \
--working_directory $working_directory \
--data_path $data_path \
--search_space act_cifar10 \
--objective act_cifar10 \
--searcher $searcher \
--surrogate_model $surrogate_model \
--seed $seed \
--pool_strategy evolution \
--pool_size 200 \
--n_init 50 \
--log \
--p_self_crossover 0.5 \
--max_evaluations_total 1000
where $working_directory is the directory you want to save results to and $data_path is the path to the dataset.
The other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| searcher | bayesian_optimization, random_search, or regularized_evolution |
| surrogate_model | gpwl_hierarchical (hWL), gpwl (WL), or gp_nasbot (NASBOT) (only active if searcher is set to bayesian_optimization) |
| seed | 777, 888, 999 (note that we only used seed 777 in our experiments) |
2.3 Surrogate experiments
Search has to be run beforehand or data needs to be provided!
To reproduce our surrogate experiments, run
bash
python experiments/surrogate_regression.py \
--working_directory $working_directory \
--search_space $search_space \
--objective $objective \
--surrogate_model $surrogate_model \
--n_train $n_train \
--log
where $working_directory is the directory where the data from the search runs has been saved and where the surrogate results will be saved. The other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| search_space | nb201_variable_multi_multi (hierarchical) or nb201_fixed_1_none (cell-based) |
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
| surrogate_model | gpwl_hierarchical (hWL), gpwl (WL), or nasbot (NASBOT) |
| n_train | 10, 25, 50, 75, 100, 150, 200, 300, or 400 |
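A concrete surrogate-regression invocation could look like the dry run below (the path `results/hnas` is a made-up example; all option values come from the tables above):

```shell
# Dry run: print an illustrative surrogate-regression command with concrete
# values from the options table. The path is a made-up example.
working_directory=results/hnas
search_space=nb201_fixed_1_none   # cell-based search space
objective=nb201_cifar100
surrogate_model=gpwl_hierarchical # hWL
n_train=100
cmd="python experiments/surrogate_regression.py \
  --working_directory $working_directory \
  --search_space $search_space \
  --objective $objective \
  --surrogate_model $surrogate_model \
  --n_train $n_train \
  --log"
echo "$cmd"
```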
2.4 Zero-cost proxy experiments
Search has to be run beforehand or data needs to be provided!
To reproduce our zero-cost proxy rank correlation experiments, run
bash
python experiments/zero_cost_proxy_rank_correlation.py \
--working_directory $working_directory \
--search_space $search_space \
--objective $objective \
--data_path $data_path \
--log
where $working_directory is the directory you want to save results to and where the data from the search runs has been saved, and $data_path is the path to the dataset.
Other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| search_space | nb201_variable_multi_multi (hierarchical) or nb201_fixed_1_none (cell-based) |
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
2.5 NASWOT search experiment
To reproduce the NASWOT search experiment, run
bash
python experiments/optimize_naswot.py \
--working_directory $working_directory \
--search_space $search_space \
--objective $objective \
--data_path $data_path \
--seed $seed \
--naslib
where $working_directory is the directory you want to save results to and $data_path is the path to the dataset. The other variables can be set as follows:
| variable | options |
|--------------------------|-------------------------------------------------------------------|
| search_space | nb201_variable_multi_multi (hierarchical) or nb201_fixed_1_none (cell-based) |
| objective | nb201_cifar10, nb201_cifar100, nb201_ImageNet16-120, nb201_cifarTile, or nb201_addNIST |
| seed | 777, 888, 999 |
2.6 DARTS search experiment
Code is coming soon
2.7 Transformer search experiment
Code is coming soon
3. Acknowledgements
We thank the authors of the following works for open-sourcing their code:
- NAS-BOWL: GPWL surrogate model base implementation, NASBOT's graph encoding scheme
- NASLib: base graph class, zero-cost proxies
- NAS-Bench-201: training protocols of the NAS-Bench-201 search space
- CVPR-NAS 2021 Competition Track 3: dataset generation and training protocols for AddNIST and CIFARTile
- NASWOT: implementation of zero-cost proxy search
- DARTS, DrNAS: implementation of the DARTS training pipeline and of the DARTS (+ improved versions) search algorithms
Owner
- Name: AutoML-Freiburg-Hannover
- Login: automl
- Kind: organization
- Location: Freiburg and Hannover, Germany
- Website: www.automl.org
- Repositories: 186
- Profile: https://github.com/automl
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you find our approach interesting for your own work, please cite the corresponding paper."
authors:
  - family-names: Schrodi
    given-names: Simon
  - family-names: Stoll
    given-names: Danny
  - family-names: Ru
    given-names: Binxin
  - family-names: Sukthanker
    given-names: Rhea
  - family-names: Brox
    given-names: Thomas
  - family-names: Hutter
    given-names: Frank
title: "Towards Discovering Neural Architectures from Scratch"
version: 0.1.0
date-released: 2022-11-03
url: "https://github.com/automl/towards_nas_from_scratch"
preferred-citation:
  type: misc
  doi: 10.48550/ARXIV.2211.01842
  url: "https://arxiv.org/abs/2211.01842"
  authors:
    - family-names: Schrodi
      given-names: Simon
    - family-names: Stoll
      given-names: Danny
    - family-names: Ru
      given-names: Binxin
    - family-names: Sukthanker
      given-names: Rhea
    - family-names: Brox
      given-names: Thomas
    - family-names: Hutter
      given-names: Frank
  keywords: "Machine Learning (cs.LG), Artificial Intelligence (cs.AI), Computer Vision and Pattern Recognition (cs.CV), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences"
  title: "Towards Discovering Neural Architectures from Scratch"
  publisher: arXiv
  year: 2022
  copyright: "arXiv.org perpetual, non-exclusive license"
GitHub Events
Total
- Watch event: 4
- Fork event: 1
Last Year
- Watch event: 4
- Fork event: 1
Committers
Last synced: 8 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| schrodi | s****i@c****e | 18 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 2
- Total pull requests: 0
- Average time to close issues: 2 months
- Average time to close pull requests: N/A
- Total issue authors: 2
- Total pull request authors: 0
- Average comments per issue: 0.5
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 0
- Average comments per issue: 0.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- akhauriyash (1)
- wupeihan248 (1)