Projects in Awesome Lists tagged with tensorrt-inference
A curated list of projects in awesome lists tagged with tensorrt-inference.
https://github.com/jolibrain/deepdetect
Deep Learning API and server in C++14, with support for PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and t-SNE
caffe deep-learning gpu image-classification image-search image-segmentation machine-learning ncnn neural-nets object-detection pytorch rest-api tensorrt tensorrt-conversion tensorrt-inference time-series tsne xgboost
Last synced: 10 Apr 2025
https://github.com/LCH1238/bevdet-tensorrt-cpp
BEVDet implemented with TensorRT in C++, achieving real-time performance on Orin
3ddetection bev tensorrt-inference
Last synced: 19 Mar 2025
https://github.com/kamalkraj/stable-diffusion-tritonserver
Deploy a Stable Diffusion model with ONNX/TensorRT and Triton Inference Server (see the client sketch after this entry)
deploy docker fp16 inference machine-learning nvidia onnx python3 pytorch stablediffusion tensorrt tensorrt-inference transformers triton-inference-server
Last synced: 12 Apr 2025
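Models served this way are queried through Triton's standard client API rather than a repo-specific interface. Below is a minimal sketch of such a request using the tritonclient HTTP client; the model name ("stable_diffusion") and tensor names ("PROMPT", "IMAGE") are assumptions for illustration — the real names come from the repository's Triton model configuration.

```python
import numpy as np
import tritonclient.http as httpclient

# Connect to a locally running Triton Inference Server (default HTTP port).
client = httpclient.InferenceServerClient(url="localhost:8000")

# Text prompts are sent as a BYTES tensor backed by a numpy object array.
prompt = np.array([["a photo of an astronaut riding a horse"]], dtype=object)
inp = httpclient.InferInput("PROMPT", prompt.shape, "BYTES")
inp.set_data_from_numpy(prompt)

# Model and tensor names are hypothetical; check the deployed model config.
result = client.infer(model_name="stable_diffusion", inputs=[inp])
image = result.as_numpy("IMAGE")  # output layout is model-dependent
```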
https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_dnn_inference
ROS 2 packages for NVIDIA-accelerated DNN model inference using NVIDIA Triton/TensorRT, for both Jetson and x86_64 systems with a CUDA-capable GPU
ai deep-learning deeplearning dnn gpu jetson nvidia ros ros2 ros2-humble tao tensorrt tensorrt-inference triton triton-inference-server
Last synced: 18 Mar 2025
https://github.com/yuvraj108c/ComfyUI-Depth-Anything-Tensorrt
ComfyUI Depth Anything (v1/v2) TensorRT custom node (up to 14x faster)
comfyui comfyui-nodes depth-anything depth-map onnx stable-diffusion tensorrt tensorrt-inference
Last synced: 19 Dec 2024
https://github.com/koldim2001/trafficanalyzer
Traffic analysis at a roundabout using computer vision
bytetrack bytetracker docker docker-compose flask grafana grafana-dashboard hydra influxdb multiple-object-tracking multiprocessing nginx object-detection oop-principles postgresql tensorrt-inference traffic-analysis transport-detection triton-inference-server yolov8
Last synced: 09 Apr 2025
https://github.com/mingj2021/segment-anything-tensorrt
large-model-export segment-anything tensorrt-inference
Last synced: 20 Mar 2025
https://github.com/BlueMirrors/Yolov5-TensorRT
Yolov5 TensorRT Implementations
tensorrt tensorrt-conversion tensorrt-inference yolov5
Last synced: 21 Apr 2025
https://github.com/emptysoal/TensorRT-YOLOv8
Based on TensorRT v8.0+, deploys YOLOv8 detection, pose, segmentation and tracking with C++ and Python APIs.
bytetrack cpp deep-learning detection pose python3 segmentation tensorrt tensorrt-conversion tensorrt-inference tracking yolov8
Last synced: 15 Dec 2024
https://github.com/qengineering/yolov8-tensorrt-jetson_nano
A lightweight C++ implementation of YOLOv8 running on NVIDIA's TensorRT engine
jetson jetson-nano jetson-orin jetson-orin-nano nvidia tensorrt tensorrt-engine tensorrt-inference yolov5 yolov8 yolov8s
Last synced: 13 Apr 2025
https://github.com/lona-cn/vision-simple
A lightweight, cross-platform C++ vision inference library supporting YOLOv10, YOLOv11, PaddleOCR and EasyOCR, using ONNX Runtime/TVM with multiple execution providers.
cuda directml easyocr ocr onnxruntime paddleocr tensorrt-inference tvm yolo
Last synced: 02 Feb 2025
https://github.com/parlaynu/inference-tensorrt
Convert ONNX models to TensorRT engines and run inference in containerized environments (see the conversion sketch after this entry)
docker jetson-nano nvidia-gpu onnx python pyzmq tensorrt-inference zeromq
Last synced: 09 May 2025
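The ONNX-to-engine conversion step that projects like this automate boils down to a short builder script. The sketch below uses the standard TensorRT 8.x Python API (not necessarily this repository's exact code); paths, the workspace size, and the explicit-batch flag are illustrative assumptions.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str, engine_path: str) -> None:
    """Parse an ONNX model and serialize a TensorRT engine to disk."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("failed to parse ONNX model")
    config = builder.create_builder_config()
    # 1 GiB workspace is an arbitrary example value.
    config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)
    serialized = builder.build_serialized_network(network, config)
    with open(engine_path, "wb") as f:
        f.write(serialized)
```

The serialized engine is hardware-specific, which is why such projects typically rebuild it inside the target container rather than shipping a prebuilt engine.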
https://github.com/maximedebarbat/dolphin
Dolphin is a Python toolkit meant to speed up TensorRT inference by providing CUDA-accelerated processing.
cuda python tensorrt-inference
Last synced: 15 Apr 2025
https://github.com/cuixing158/yolo-tensorrt-cpp
Deployment and quantization library for PC and Jetson; INT8 quantization, YOLOv3/v4/v5
tensorrt tensorrt-engine tensorrt-inference yolov3 yolov4 yolov5
Last synced: 28 Feb 2025
https://github.com/princep/tensorrt-sample-on-threads
A tutorial for getting started with running TensorRT engine and Deep Learning Accelerator (DLA) models on threads
cpp deep-learning-accelerator dla mnist nvcc tensorrt tensorrt-inference threads
Last synced: 23 Feb 2025
https://github.com/asitiaf/llm-getting-started
Practical, beginner-friendly LLM projects using Python, LangChain, and LangSmith. Modular, reusable, and easy to run.
ai chatgpt cloudnative course emacs genai jupyter-notebook obsidian-md productivity python self-hosted stt tensorrt-inference whatsapp-ai
Last synced: 30 Mar 2025
https://github.com/ce-dric/tensorrt-batch
TensorRT inference that processes inputs in batch units (see the sketch after this entry)
docker docker-container python pytorch tensorrt tensorrt-engine tensorrt-inference
Last synced: 05 Apr 2025
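Batched inference against a prebuilt engine follows the same pattern regardless of toolkit: set the runtime batch shape, copy a whole batch to the GPU, run the context, copy the result back. The sketch below is a generic TensorRT 8.x + PyCUDA example, not this repository's API; it assumes a single FP32 input at binding 0 and a single FP32 output at binding 1, and an engine built with a dynamic batch dimension.

```python
import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def infer_batch(engine_path: str, batch: np.ndarray) -> np.ndarray:
    """Run one batch through a TensorRT engine with a dynamic batch dimension."""
    with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        engine = runtime.deserialize_cuda_engine(f.read())
    context = engine.create_execution_context()
    # Assumption: binding 0 is the input, binding 1 the output (engine-specific).
    context.set_binding_shape(0, batch.shape)
    output = np.empty(tuple(context.get_binding_shape(1)), dtype=np.float32)

    d_in = cuda.mem_alloc(batch.nbytes)
    d_out = cuda.mem_alloc(output.nbytes)
    stream = cuda.Stream()
    cuda.memcpy_htod_async(d_in, np.ascontiguousarray(batch), stream)
    context.execute_async_v2([int(d_in), int(d_out)], stream.handle)
    cuda.memcpy_dtoh_async(output, d_out, stream)
    stream.synchronize()
    return output
```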
https://github.com/mohamedsamirx/yolov12-tensorrt-cpp
YOLOv12 inference using C++, TensorRT, and CUDA
cpp cuda tensorrt tensorrt-inference yolo yolov12
Last synced: 22 Feb 2025
https://github.com/littletomatodonkey/model_inference
Model conversion and inference code for different backends
onnx onnx-convertation onnxruntime pytorch pytorch-converter tensorrt tensorrt-conversion tensorrt-inference
Last synced: 20 Mar 2025