https://github.com/cmower/hyparam
Container for hyperparameter tuning in machine learning.
Science Score: 23.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ✓ Committers with academic emails: 1 of 2 committers (50.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (6.5%) to scientific vocabulary
Keywords
Repository
Basic Info
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
hyparam
Container for hyperparameter tuning in machine learning.
Example
Load programmatically
You can add parameters to the search space using the following methods.

```python
from hyparam import HyperParameters  # import path assumed from the package name

hp = HyperParameters()
hp.add_linspace("learning_rate", 0.1, 0.3, 3)  # 3 evenly spaced values from 0.1 to 0.3
hp.add_switch("use_test_dataset")              # boolean switch: True / False
hp.add_range("epochs", 10, 30, 10)             # 10, 20 (stop is exclusive)
hp.add_list("myvar", [1.0, 12.0, 8.0])         # explicit list of values
```
Load from file
You can instead specify a parameter space in a YAML configuration file.
```yaml
learning_rate:
  type: linspace
  setup:
    lower: 0.1
    upper: 0.3
    num: 3

use_test_dataset:
  type: switch

epochs:
  type: range
  setup:
    start: 10
    stop: 30
    step: 10

myvar:
  type: list
  setup:
    values: [1.0, 12.0, 8.0]
```
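Each top-level key names a parameter; `type` appears to select the corresponding `add_*` method and `setup` supplies its arguments, so this file describes the same search space as the programmatic example above.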
This is loaded into Python as follows.
```python
hp = HyperParameters.from_file(file_name)
```
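For example, with the YAML above saved as `params.yaml` (a hypothetical filename), `HyperParameters.from_file("params.yaml")` should reproduce the search space defined programmatically above.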
Iterating over the parameter space
In both the above examples, you can iterate over the parameter space using the choices method.
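A minimal sketch of this loop, assuming `choices()` yields one namedtuple-like object per grid point (as the output below suggests):

```python
from hyparam import HyperParameters  # import path assumed from the package name

hp = HyperParameters()
hp.add_linspace("learning_rate", 0.1, 0.3, 3)
hp.add_switch("use_test_dataset")
hp.add_range("epochs", 10, 30, 10)
hp.add_list("myvar", [1.0, 12.0, 8.0])

# Assumption: choices() enumerates the Cartesian product of all axes.
for choice in hp.choices():
    print(choice)
```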
See the examples in the example directory.
You should expect the following output when you run these.
```
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=8.0)
```
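Each of these 36 lines is one point in the Cartesian product of the four axes. Note that `epochs` takes only the values 10 and 20, because the range's stop value of 30 is exclusive, as with Python's built-in `range`.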
Install
From source
In a new terminal:
1. Clone the repository:
   - (ssh) `$ git clone git@github.com:cmower/hyparam.git`, or
   - (https) `$ git clone https://github.com/cmower/hyparam.git`
2. Change directory: `$ cd hyparam`
3. Ensure pip is up to date: `$ python -m pip install --upgrade pip`
4. Install: `$ pip install .`
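Alternatively, pip can usually install straight from the repository URL (assuming a standard build setup): `$ pip install git+https://github.com/cmower/hyparam.git`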
Owner
- Name: Chris Mower
- Login: cmower
- Kind: user
- Location: London, UK
- Company: Huawei Technologies R&D
- Website: https://cmower.github.io/
- Repositories: 55
- Profile: https://github.com/cmower
Senior Research Scientist at Huawei Technologies R&D.
GitHub Events
(Activity charts: total and last year.)
Committers
Last synced: over 1 year ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Christopher E. Mower | c****r@k****k | 14 |
| Chris Mower | c****r | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 11 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0