https://github.com/cmower/hyparam
Container for hyperparameter tuning in machine learning.
- Host: GitHub
- URL: https://github.com/cmower/hyparam
- Owner: cmower
- License: gpl-3.0
- Created: 2023-05-11T16:39:47.000Z (about 2 years ago)
- Default Branch: master
- Last Pushed: 2023-05-11T20:14:57.000Z (about 2 years ago)
- Last Synced: 2025-02-01T12:08:51.266Z (4 months ago)
- Topics: hyperparameter-tuning, machine-learning
- Language: Python
- Homepage:
- Size: 24.4 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# hyparam
Container for hyperparameter tuning in machine learning.
# Example
## Load programmatically
You can add parameters to the parameter search space using the following methods.
```python
from hyparam import HyperParameters  # assumes the package exposes HyperParameters at the top level

hp = HyperParameters()
hp.add_linspace("learning_rate", 0.1, 0.3, 3)  # 3 evenly spaced values: 0.1, 0.2, 0.3
hp.add_switch("use_test_dataset")              # boolean flag: True or False
hp.add_range("epochs", 10, 30, 10)             # like Python's range, stop exclusive: 10, 20
hp.add_list("myvar", [1.0, 12.0, 8.0])         # explicit list of values
```

## Load from file
You can instead specify a parameter space in a YAML configuration file.
```yaml
learning_rate:
  type: linspace
  setup:
    lower: 0.1
    upper: 0.3
    num: 3
use_test_dataset:
  type: switch
epochs:
  type: range
  setup:
    start: 10
    stop: 30
    step: 10
myvar:
  type: list
  setup:
    values: [1.0, 12.0, 8.0]
```

This is loaded into Python as follows.
```python
hp = HyperParameters.from_file(file_name)
```

## Iterating over the parameter space
In both the above examples, you can iterate over the parameter space using the `choices` method.
See the examples in the [example](example/) directory.
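For illustration, a minimal loop over the space might look like the sketch below. This assumes `choices` yields one named-tuple-style object per combination, with an attribute for each parameter (as the printed output suggests); the training call is hypothetical.

```python
# Sketch only: assumes hp.choices() yields one object per combination,
# with an attribute for each parameter (matching the output shown below).
for choice in hp.choices():
    print(choice)
    # Hypothetical training call using the chosen values:
    # run_training(lr=choice.learning_rate,
    #              epochs=choice.epochs,
    #              use_test=choice.use_test_dataset,
    #              myvar=choice.myvar)
```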
You should expect the following output when you run these.
```
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.1, use_test_dataset=False, epochs=20, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.2, use_test_dataset=False, epochs=20, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=10, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=True, epochs=20, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=10, myvar=8.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=1.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=12.0)
choice(learning_rate=0.3, use_test_dataset=False, epochs=20, myvar=8.0)
```

The `choices` method enumerates the Cartesian product of all parameter values: 3 learning rates × 2 switch settings × 2 epoch values × 3 list values = 36 combinations.

# Install
## From source
In a new terminal:
1. Clone repository:
- (ssh) `$ git clone [email protected]:cmower/hyparam.git`, or
- (https) `$ git clone https://github.com/cmower/hyparam.git`
2. Change directory: `$ cd hyparam`
3. Ensure `pip` is up-to-date: `$ python -m pip install --upgrade pip`
4. Install: `$ pip install .`
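
Alternatively, you can install directly from GitHub without cloning: `$ pip install git+https://github.com/cmower/hyparam.git`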