Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/modularml/max
A collection of sample programs, notebooks, and tools which highlight the power of the MAX Platform
ai genai llama2 llm ml onnx pytorch
Last synced: about 1 month ago
- Host: GitHub
- URL: https://github.com/modularml/max
- Owner: modularml
- License: other
- Created: 2024-02-15T21:58:11.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-10-09T00:25:05.000Z (about 1 month ago)
- Last Synced: 2024-10-10T06:32:07.688Z (about 1 month ago)
- Topics: ai, genai, llama2, llm, ml, onnx, pytorch
- Language: Python
- Homepage: https://www.modular.com
- Size: 5.15 MB
- Stars: 335
- Watchers: 24
- Forks: 42
- Open Issues: 28
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
  - Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
README
# Welcome to MAX
The Modular Accelerated Xecution ([MAX](https://www.modular.com/max)) platform
is an integrated suite of AI libraries, tools, and technologies that unifies
commonly fragmented AI deployment workflows. MAX accelerates time to market
for the latest innovations by giving AI developers a single toolchain that
unlocks full programmability, unparalleled performance, and seamless hardware portability.
[See here to get started with MAX](https://docs.modular.com/max/get-started)
and when you want to report issues or request features,
[please create a GitHub issue here](https://github.com/modularml/max/issues/new/choose).

The [Discord](https://discord.gg/modular) community is the best place to share
your experiences and chat with the team and other community members.

In the [examples directory](https://github.com/modularml/max/tree/main/examples),
you will find code examples and Jupyter notebooks that show how to run inference
with MAX.

## Getting Started
MAX is available in both stable and nightly builds. To install either version,
follow the guide to [create a project with
Magic](https://docs.modular.com/max/create-project).

Then clone this repository:

```bash
git clone https://github.com/modularml/max.git
```

If you installed the nightly build, be sure to switch to the `nightly` branch,
because the `main` branch is for stable releases and might not be compatible
with nightly builds:

```bash
git checkout nightly
```

## Running
### MAX Pipelines
To show off the full power of MAX, a
[series of end-to-end pipelines for common AI workloads](./examples/graph-api/pipelines/)
(and more) is ready to run. As one example, this includes everything needed to
self-host
[the Llama 3 text-generation model](./examples/graph-api/pipelines/llama3/).
These pipelines are completely written in Mojo, and all code is provided so
that they can be customized, built upon, or learned from.
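
For instance, once your Magic environment for this repository is set up, the
Llama 3 pipeline can be launched with a single command. This is a sketch only:
the task name and flag below (`llama3`, `--prompt`) are assumptions about how
the pipeline is exposed through Magic, so check the
[Llama 3 pipeline README](./examples/graph-api/pipelines/llama3/) for the exact
invocation on your branch.

```bash
# Run the Llama 3 text-generation pipeline from the repository root.
# Task name and flags are illustrative; see the pipeline's README for specifics.
magic run llama3 --prompt "I believe the meaning of life is"
```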
### Examples

In addition to the end-to-end pipelines, there are many [examples](./examples/)
that exercise various aspects of MAX. You can follow the instructions in the
README for each example or notebook you want to run.

### Notebooks
Check out the [notebooks examples](./examples/notebooks/) for using MAX Engine
🏎️ for models such as the following (a quick way to open them locally is
sketched after this list):

- [Mistral-7B](./examples/notebooks/mistral7b-python-onnx.ipynb)
- [Roberta-pytorch](./examples/notebooks/roberta-python-pytorch.ipynb)
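
These are standard Jupyter notebooks, so assuming Jupyter is installed in your
active environment (it is not provided by this repository), one way to open the
Mistral-7B notebook locally is:

```bash
# Open the Mistral-7B ONNX notebook in a local Jupyter session.
# Assumes `jupyter` is available, e.g. via `pip install notebook`.
jupyter notebook examples/notebooks/mistral7b-python-onnx.ipynb
```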
### Tutorials

The [tutorials](./tutorials/) directory contains the "finished" code for
tutorials you can read at
[docs.modular.com/max/tutorials](https://docs.modular.com/max/tutorials).

### Docker Container
To deploy MAX on AWS, you can pull our Docker container from the public ECR here:

[https://gallery.ecr.aws/modular/max-serving](https://gallery.ecr.aws/modular/max-serving)

```
public.ecr.aws/modular/max-serving
```
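
As a sketch of what that looks like with plain Docker, you can pull the image
and start a container. The exposed port below is an assumption (a common choice
for HTTP inference endpoints), not a documented default, so consult the MAX
documentation for the ports, mounts, and settings your deployment actually needs.

```bash
# Pull the MAX Serving image from the public ECR gallery.
docker pull public.ecr.aws/modular/max-serving

# Start a container; the port mapping is illustrative only.
docker run --rm -it -p 8000:8000 public.ecr.aws/modular/max-serving
```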
## Contributing
Thanks for your interest in contributing to this repository!
We are not accepting pull requests yet.

However, we welcome your bug reports. If you have a bug, please file an issue
[here](https://github.com/modularml/max/issues/new/choose).

If you need support, the [Discord](https://discord.gg/modular)
community is the best place to share your experiences and chat with
the team and other community members.

## License
This repository and its contributions are licensed under the Apache License
v2.0 with LLVM Exceptions (see the LLVM [License](https://llvm.org/LICENSE.txt)).
MAX and Mojo usage and distribution are licensed under the
[MAX & Mojo Community License](https://www.modular.com/legal/max-mojo-license).

### Third Party Licenses
You are entirely responsible for checking and validating the licenses of
third parties (e.g., Hugging Face) for related software and libraries that are downloaded.