Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/captain-pool/hydra-wandb-sweeper
WandB sweeps integration with Hydra sweeper
- Host: GitHub
- URL: https://github.com/captain-pool/hydra-wandb-sweeper
- Owner: captain-pool
- Created: 2022-02-15T22:26:37.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-01-26T16:01:04.000Z (10 months ago)
- Last Synced: 2024-07-03T23:45:39.949Z (4 months ago)
- Language: Python
- Homepage:
- Size: 28.3 KB
- Stars: 44
- Watchers: 4
- Forks: 6
- Open Issues: 2
Metadata Files:
- Readme: README.md
README
## WandB sweeper integration with Hydra
### Installing
```bash
$ pip install git+https://github.com/captain-pool/hydra-wandb-sweeper.git
```

## Usage
Using this plugin is simple: invoke the sweeper by overriding `hydra/sweeper` with `wandb`.
```python
import hydra
import wandb
from omegaconf import DictConfig

@hydra.main(config_path=..., config_name=...)
def main(cfg: DictConfig):
    with wandb.init(id=run_id) as run:  # run_id and loss are placeholders here
        ...
        wandb.log({"loss": loss})
```

## Hydra Overrides
### Passing distributions for continuous parameters
Sweeping over continuous parameters (for example, learning rates in ML pipelines) is done by passing the lower and upper limits with Hydra's `interval(start, end)` override. For example:
```
$ python3 /path/to/trainer/file hydra/sweeper=wandb model.learning_rate="interval(0.1, 0.2)"
```

By default, the `interval()` sweep uses a uniform distribution when suggesting parameter values. To use a different distribution, add the name of a supported distribution (`uniform`, `log_uniform`, `q_uniform`, `q_log_uniform`, `q_normal`, `log_normal`, `q_log_normal`) as a [`tag`](https://hydra.cc/docs/advanced/override_grammar/extended/#tag) to the [`interval()`](https://hydra.cc/docs/advanced/override_grammar/extended/#interval-sweep) sweep. For example, to suggest a value for `learning_rate` from a normal distribution with `mean=0` and `variance=1`:
```
$ python3 /path/to/trainer/file hydra/sweeper=wandb model.learning_rate="tag(normal, interval(0, 1))"
```
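For reference, the overrides above map onto WandB's standard sweep-config format. Below is a minimal hand-written sketch of the equivalent parameter configuration using the public `wandb.sweep()` API; the search method, metric, and project name are illustrative assumptions, and the config this plugin actually generates may differ.

```python
import wandb

# Illustrative, hand-written equivalent of the overrides above, expressed in the
# standard WandB sweep-config format. The search method, metric, and project name
# are assumptions, not taken from the plugin.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "loss", "goal": "minimize"},
    "parameters": {
        # interval(0.1, 0.2)          -> uniform distribution over [0.1, 0.2]
        "model.learning_rate": {"distribution": "uniform", "min": 0.1, "max": 0.2},
        # tag(normal, interval(0, 1)) -> normal distribution with mu=0, sigma=1
        # "model.learning_rate": {"distribution": "normal", "mu": 0, "sigma": 1},
    },
}

sweep_id = wandb.sweep(sweep_config, project="hydra-wandb-demo")
```

With the plugin, a config of this shape is assembled from the command-line overrides instead of being written by hand.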
### Categorical parameters

Sweeping over categorical items is as simple as passing a list of items to sweep over, which directly uses Hydra's [`choice()`](https://hydra.cc/docs/advanced/override_grammar/extended/#choice-sweep) sweep. For example, to sweep over a categorical list of `batch_size` values:
```
$ python3 /path/to/trainer/file hydra/sweeper=wandb model.batch_size=8,10,12,14
```
Equivalently, the `choice()` sweep can be used explicitly:
```
$ python3 /path/to/trainer/file hydra/sweeper=wandb model.batch_size="choice(8,10,12,14)"
```

A categorical sweep can also be performed with Hydra's [`range()`](https://hydra.cc/docs/advanced/override_grammar/extended/#range-sweep) sweep. The previous sweep can equivalently be written as:
```
$ python3 /path/to/trainer/file hydra/sweeper=wandb model.batch_size="range(8, 15, step=2)"
```
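For intuition, Hydra's `range()` follows Python's `range` semantics (inclusive start, exclusive stop), so `range(8, 15, step=2)` enumerates the same values as `choice(8,10,12,14)`. The quick check below also shows how such a categorical parameter would look in the standard WandB sweep-config format; the `values` mapping is an illustrative assumption, not necessarily what the plugin emits.

```python
# range(8, 15, step=2) covers the same categorical values as choice(8, 10, 12, 14).
batch_sizes = list(range(8, 15, 2))
assert batch_sizes == [8, 10, 12, 14]

# In the standard WandB sweep-config format, a categorical parameter is a "values"
# list (illustrative only; the plugin's generated config may differ).
batch_size_param = {"model.batch_size": {"values": batch_sizes}}
```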