https://github.com/zerollzeng/tiny-tensorrt
Deploy your model with TensorRT quickly.
- Host: GitHub
- URL: https://github.com/zerollzeng/tiny-tensorrt
- Owner: zerollzeng
- Created: 2019-08-22T10:10:19.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2023-11-21T13:25:19.000Z (about 2 years ago)
- Last Synced: 2024-08-01T03:27:15.805Z (over 1 year ago)
- Topics: deep-learning, onnx, tensorrt
- Language: C++
- Homepage:
- Size: 1.18 MB
- Stars: 755
- Watchers: 28
- Forks: 99
- Open Issues: 9
Metadata Files:
- Readme: README.md

**This project is no longer maintained, since better alternatives for engine building now exist: you can use TensorRT's Python API, or the trtexec/polygraphy tools, to build an engine quickly.**
**For any issue with TensorRT itself, please file it against https://github.com/NVIDIA/TensorRT/issues**
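As a sketch of the trtexec route mentioned above (the flags below are standard trtexec options; the file names are placeholders):

```bash
# Build a serialized FP16 engine directly from an ONNX model.
# trtexec ships with TensorRT (typically under /usr/src/tensorrt/bin).
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16

# Sanity check: load the engine back and report latency/throughput.
trtexec --loadEngine=model.engine
```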
## tiny-tensorrt
An easy-to-use NVIDIA TensorRT wrapper for ONNX models, with C++ and Python APIs. You can deploy your model with tiny-tensorrt in just a few lines of code:
```c++
// Build a TensorRT engine from an ONNX model and run inference.
Trt* net = new Trt();
net->SetFP16();                                     // enable FP16 precision
net->BuildEngine(onnxModel, engineFile);            // build (or load) the serialized engine
net->CopyFromHostToDevice(input, inputBindIndex);   // upload input data
net->Forward();                                     // run inference
net->CopyFromDeviceToHost(output, outputBindIndex); // download results
```
## Install
tiny-tensorrt relies on CUDA, cuDNN, and TensorRT; make sure you have installed those dependencies already. For a quick start, you can use the [official docker image](https://ngc.nvidia.com/catalog/containers/nvidia:tensorrt).

Supported CUDA versions: 10.2, 11.0, 11.1, 11.2, 11.3, 11.4
Supported TensorRT versions: 7.0, 7.1, 7.2, 8.0, 8.2, 8.4
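For the docker route, a minimal sketch looks like the following (the image tag here is only an example; pick a tag from the NGC catalog whose CUDA/TensorRT versions match the supported lists above):

```bash
# Pull a TensorRT container from NVIDIA NGC.
docker pull nvcr.io/nvidia/tensorrt:21.06-py3

# Start it with GPU access and the current directory mounted in.
docker run --gpus all -it --rm -v "$PWD":/workspace nvcr.io/nvidia/tensorrt:21.06-py3
```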
To build tiny-tensorrt, you also need some extra packages.
```bash
sudo apt-get update -y
sudo apt-get install cmake zlib1g-dev
## for the Python binding
sudo apt-get install python3 python3-pip
pip3 install numpy
## clone the project and its submodules
git clone --recurse-submodules -j8 https://github.com/zerollzeng/tiny-tensorrt.git
cd tiny-tensorrt
mkdir build && cd build
cmake .. && make
```
Then you can integrate it into your own project with libtinytrt.so and Trt.h; for the Python module, you get pytrt.so.
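As an illustration of the integration step, a minimal CMake fragment for a consumer project might look like this (the paths and the `my_app` target are hypothetical; adjust them to wherever you copied the built artifacts):

```cmake
# Hypothetical consumer project layout: Trt.h and libtinytrt.so
# copied into ./third_party/tiny-tensorrt
add_executable(my_app main.cpp)
target_include_directories(my_app PRIVATE ${CMAKE_SOURCE_DIR}/third_party/tiny-tensorrt)
target_link_libraries(my_app PRIVATE ${CMAKE_SOURCE_DIR}/third_party/tiny-tensorrt/libtinytrt.so)
```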
## Docs
Please refer to [Wiki](https://github.com/zerollzeng/tiny-tensorrt/wiki)
## License
For the third-party modules and TensorRT, you must follow their respective licenses.
For the parts I wrote, you can do anything you want.