https://github.com/princep/tensorrt-sample-on-threads

A tutorial for getting started with running TensorRT engines and Deep Learning Accelerator (DLA) models on threads.

# TensorRT sample for running engines on different threads

This sample runs inference on the GPU and the DLA cores at the same time, with each engine running on its own thread.
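
As a rough illustration of the approach, here is a minimal sketch of loading one engine and preparing it for a given device with the TensorRT C++ runtime API. This is not the repo's actual source: the `runEngine` helper name and the omitted buffer setup are assumptions, and the snippet targets the TensorRT 8.x API.

```cpp
#include <NvInfer.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
#include <vector>

// Minimal logger required by createInferRuntime().
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
} gLogger;

// device == -1 selects the GPU; 0 or 1 selects a DLA core.
void runEngine(const std::string& enginePath, int device)
{
    // Read the serialized engine produced by trtexec.
    std::ifstream file(enginePath, std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    if (device >= 0)
        runtime->setDLACore(device);  // select the DLA core before deserializing

    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // Buffer allocation and the actual inference call (executeV2/enqueueV2)
    // are omitted here for brevity.
    (void)context;
}
```

One thread per device can then call `runEngine` concurrently, which is what lets the GPU and the DLAs run at the same time.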

1. Prepare TensorRT engines for the GPU and the DLA with `trtexec`. For example:

For GPU
```sh
trtexec --onnx=/usr/src/tensorrt/data/mnist/mnist.onnx --saveEngine=gpu.engine
```

For DLA
```sh
trtexec --onnx=/usr/src/tensorrt/data/mnist/mnist.onnx --useDLACore=0 --allowGPUFallback --saveEngine=dla.engine
```

2. Compile the repo

```sh
make
```

3. Test

Put the gpu.engine and dla.engine files generated in step 1 into the cloned repo.

The command runs like this:
```sh
./test ... # -1: GPU, 0: DLA0, 1: DLA1
```

Ex: Run GPU+DLA0+DLA1
```sh
./test -1 0 1
```
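
To show how the device arguments could map to threads, here is a hedged sketch of a `main()` that mirrors this usage. It assumes the hypothetical `runEngine` helper from the sketch above and the gpu.engine / dla.engine files from step 1; it is not the repo's actual implementation.

```cpp
#include <string>
#include <thread>
#include <vector>

// Hypothetical helper (see the sketch above); -1 = GPU, 0/1 = DLA core.
void runEngine(const std::string& enginePath, int device);

int main(int argc, char** argv)
{
    std::vector<std::thread> workers;
    for (int i = 1; i < argc; ++i)
    {
        int device = std::stoi(argv[i]);
        // A GPU thread loads gpu.engine; a DLA thread loads dla.engine.
        std::string enginePath = (device < 0) ? "gpu.engine" : "dla.engine";
        workers.emplace_back(runEngine, enginePath, device);
    }
    // Wait for every engine to finish before exiting.
    for (auto& w : workers)
        w.join();
    return 0;
}
```

With this layout, `./test -1 0 1` spawns three threads: one running gpu.engine on the GPU and one running dla.engine on each DLA core.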