https://github.com/hidet-org/hidet
An open-source efficient deep learning framework/compiler, written in Python.
- Host: GitHub
- URL: https://github.com/hidet-org/hidet
- Owner: hidet-org
- License: apache-2.0
- Created: 2022-01-04T02:52:45.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2025-02-25T17:50:29.000Z (2 months ago)
- Last Synced: 2025-03-16T20:38:02.526Z (about 2 months ago)
- Topics: compiler, deep-learning, framework, inference
- Language: Python
- Homepage: https://hidet.org
- Size: 4.6 MB
- Stars: 691
- Watchers: 19
- Forks: 58
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-tensor-compilers - Hidet: A Compilation-based Deep Learning Framework
README
# Hidet: An Open-Source Deep Learning Compiler
[**Documentation**](http://hidet.org/docs) |
[**Research Paper**](https://dl.acm.org/doi/10.1145/3575693.3575702) |
[**Releases**](https://github.com/hidet-org/hidet/releases) |
[**Contributing**](https://hidet.org/docs/stable/developer-guides/contributing.html)
Hidet is an open-source deep learning compiler, written in Python.
It supports end-to-end compilation of DNN models from PyTorch and ONNX to efficient CUDA kernels.
A series of graph-level and operator-level optimizations are applied to improve performance.

Currently, Hidet focuses on optimizing inference workloads on NVIDIA GPUs and requires the following (a quick environment check is sketched after the list):
- Linux OS
- CUDA Toolkit 11.6+
- Python 3.9+
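
The snippet below is a minimal sketch of an environment check for these prerequisites. It uses only the Python standard library and is not part of Hidet itself.

```python
# Minimal sketch of an environment check for the prerequisites above
# (standard library only; not part of Hidet).
import shutil
import sys

assert sys.platform == 'linux', 'Hidet currently targets Linux'
assert sys.version_info >= (3, 9), 'Python 3.9+ is required'
assert shutil.which('nvcc') is not None, 'CUDA Toolkit (11.6+) with nvcc not found on PATH'
print('Prerequisites look satisfied')
```
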
## Getting Started

### Installation

If you are going to use Hidet's API:

```bash
pip install hidet
```

If you are going to use Hidet as a PyTorch compiler:

```bash
pip install hidet[torch]
```

You can also try the [nightly build version](https://docs.hidet.org/stable/getting-started/install.html) or [build from source](https://docs.hidet.org/stable/getting-started/build-from-source.html#).
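
To confirm the installation, the following small sketch imports the package and prints its version; it assumes that `hidet.__version__` is exposed, which recent releases do.

```python
# Post-install sanity check (a sketch; assumes hidet.__version__ is exposed).
import hidet

print('Hidet version:', hidet.__version__)
```
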
### Usage
Optimize a PyTorch model through Hidet (requires PyTorch 2.3):
```python
import torch

# Define the PyTorch model
model = torch.hub.load('pytorch/vision:v0.6.0', 'resnet18', pretrained=True).cuda().eval()
x = torch.rand(1, 3, 224, 224).cuda()

# Compile the model through Hidet
# Optional: set optimization options (see our documentation for more details)
# import hidet
# hidet.torch.dynamo_config.search_space(2) # tune each tunable operator
model_opt = torch.compile(model, backend='hidet')

# Run the optimized model
y = model_opt(x)
```
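
Continuing from the snippet above, the sketch below uses only standard PyTorch APIs to compare the compiled model against eager mode and to time it with CUDA events; the iteration counts are illustrative.

```python
# Continues from the snippet above (reuses model, x, y, model_opt).
# Uses only standard PyTorch APIs; iteration counts are illustrative.
with torch.no_grad():
    y_eager = model(x)
print('max abs diff vs eager:', (y - y_eager).abs().max().item())

# Warm up so compilation/tuning does not count towards the measurement
for _ in range(10):
    model_opt(x)
torch.cuda.synchronize()

start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
start.record()
for _ in range(100):
    model_opt(x)
end.record()
torch.cuda.synchronize()
print('mean latency: {:.3f} ms'.format(start.elapsed_time(end) / 100))
```
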
See the following tutorials to learn other usages:
- [Quick Start](http://hidet.org/docs/stable/gallery/getting-started/quick-start.html)
- [Optimize PyTorch models](http://hidet.org/docs/stable/gallery/tutorials/optimize-pytorch-model.html)
- [Optimize ONNX models](http://hidet.org/docs/stable/gallery/tutorials/optimize-onnx-model.html)
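
For the ONNX path, the block below is a condensed sketch of the flow covered in the tutorial above. The API names (`hidet.graph.frontend.from_onnx`, `hidet.symbol_like`, `hidet.trace_from`, `hidet.graph.optimize`) follow the Hidet documentation and may change between releases; `resnet18.onnx` is a placeholder path.

```python
# Condensed sketch of the ONNX flow from the tutorial above; API names follow the
# Hidet documentation and may change between releases. 'resnet18.onnx' is a placeholder.
import hidet

onnx_module = hidet.graph.frontend.from_onnx('resnet18.onnx')

# Trace the model with a symbolic input to build a FlowGraph
data = hidet.randn([1, 3, 224, 224], device='cuda')
symbol = hidet.symbol_like(data)
graph = hidet.trace_from(onnx_module(symbol), inputs=[symbol])

# Apply graph-level optimizations, then run with concrete data
graph_opt = hidet.graph.optimize(graph)
output = graph_opt(data)
```
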
## Publication

Hidet originates from the following research work:

> **Hidet: Task-Mapping Programming Paradigm for Deep Learning Tensor Programs**
> Yaoyao Ding, Cody Hao Yu, Bojian Zheng, Yizhi Liu, Yida Wang, and Gennady Pekhimenko.
> ASPLOS '23

If you use **Hidet** in your research, you are welcome to cite our
[paper](https://dl.acm.org/doi/10.1145/3575693.3575702).

## Development

Hidet is currently under active development by a team at [CentML Inc](https://centml.ai/).

## Contributing

We welcome contributions from the community. Please see
[contribution guide](https://hidet.org/docs/stable/developer-guides/contributing.html)
for more details.

## License

Hidet is released under the [Apache 2.0 license](LICENSE).