Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/logicalclocks/maggy
Distribution transparent Machine Learning experiments on Apache Spark
ablation ablation-studies ablation-study automl blackbox-optimization hyperparameter-optimization hyperparameter-search hyperparameter-tuning spark
Last synced: 9 days ago
- Host: GitHub
- URL: https://github.com/logicalclocks/maggy
- Owner: logicalclocks
- License: apache-2.0
- Created: 2019-02-18T14:53:13.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2024-02-21T15:57:45.000Z (12 months ago)
- Last Synced: 2025-01-16T11:24:27.020Z (16 days ago)
- Topics: ablation, ablation-studies, ablation-study, automl, blackbox-optimization, hyperparameter-optimization, hyperparameter-search, hyperparameter-tuning, spark
- Language: Python
- Homepage: https://maggy.ai
- Size: 5.74 MB
- Stars: 90
- Watchers: 11
- Forks: 14
- Open Issues: 8
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
- awesome-production-machine-learning - Maggy - Asynchronous, directed Hyperparameter search and parallel ablation studies on Apache Spark [(Video)](https://www.youtube.com/watch?v=0Hd1iYEL03w). (Neural Architecture Search)
- Awesome-AIML-Data-Ops - Maggy - Asynchronous, directed Hyperparameter search and parallel ablation studies on Apache Spark [(Video)](https://www.youtube.com/watch?v=0Hd1iYEL03w). (Neural Architecture Search)
- awesome-production-machine-learning - Maggy - Asynchronous, directed Hyperparameter search and parallel ablation studies on Apache Spark - [(Video)](https://www.youtube.com/watch?v=0Hd1iYEL03w). (AutoML)
README
Maggy is a framework for **distribution transparent** machine learning experiments on [Apache Spark](https://spark.apache.org/).
In this post, we introduce a new unified framework for writing core ML training logic as **oblivious training functions**.
Maggy enables you to reuse the same training code whether you are training small models on your laptop or scaling out hyperparameter tuning and distributed deep learning on a cluster.
Maggy enables the replacement of the current waterfall development process for distributed ML applications, where code is rewritten at every stage to account for the different distribution context.
Maggy uses the same distribution transparent training function in all steps of the machine learning development process.
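To illustrate the idea, here is a minimal sketch in plain Python (not Maggy's actual API; `grid_search` and the toy `train_fn` below are hypothetical stand-ins): the same training function can be called directly for a single local run, or handed unchanged to a driver that runs it once per hyperparameter configuration.

```python
# Toy stand-in for a "distribution transparent" training function.
# NOTE: `train_fn` and `grid_search` are hypothetical illustrations,
# not part of Maggy's API.

def train_fn(lr):
    # Toy "training": pretend accuracy peaks at lr = 0.1
    accuracy = 1.0 - abs(lr - 0.1)
    return accuracy

# Context 1: a single local run, called like any Python function.
single_result = train_fn(lr=0.1)

# Context 2: the unchanged function handed to a driver that executes it
# once per configuration (Maggy would distribute these runs over Spark).
def grid_search(fn, configs):
    results = {cfg["lr"]: fn(**cfg) for cfg in configs}
    best_lr = max(results, key=results.get)
    return best_lr, results[best_lr]

best_lr, best_acc = grid_search(train_fn, [{"lr": l} for l in (0.01, 0.1, 0.5)])
```

The point is that the training logic itself never mentions the execution context; only the driver changes between a laptop run and a cluster run.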
## Quick Start
Maggy uses PySpark as an engine to distribute the training processes. To get started, install Maggy in the Python environment used by your Spark cluster, or install it in your local Python environment with the `'spark'` extra to run Spark in local mode:
```bash
pip install maggy
```
The programming model consists of wrapping the code containing the model training inside a function. Inside that wrapper function, provide all imports and parts that make up your experiment.

Single-run experiment:
```python
def train_fn(reporter):
    # This is your training iteration loop
    for i in range(number_iterations):
        ...
        # Use the Maggy reporter to report the metric to be optimized
        reporter.broadcast(metric=accuracy)
        ...
    # Return the metric to be optimized, or any metric to be logged
    return accuracy

from maggy import experiment
result = experiment.lagom(train_fn=train_fn, name='MNIST')
```
**lagom** is a Swedish word meaning "just the right amount". This is how Maggy uses your resources.
## Documentation
Full documentation is available at [maggy.ai](https://maggy.ai/)
## Contributing
There are various ways to contribute, and any contribution is welcome. Please follow the CONTRIBUTING guide to get started.
## Issues
Issues can be reported on the official [GitHub repo](https://github.com/logicalclocks/maggy/issues) of Maggy.
## Citation
Please see our publications on [maggy.ai](https://maggy.ai/publications) to find out how to cite our work.
## Acknowledgements
The development of Maggy is supported by the EU H2020 Deep Cube Project (Grant agreement ID: 101004188).