Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/RidgeRun/gst-inference
A GStreamer Deep Learning Inference Framework
- Host: GitHub
- URL: https://github.com/RidgeRun/gst-inference
- Owner: RidgeRun
- License: lgpl-2.1
- Created: 2018-10-24T06:33:07.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2023-11-07T23:23:05.000Z (about 1 year ago)
- Last Synced: 2024-06-27T18:53:35.146Z (5 months ago)
- Topics: artificial-intelligence, deep-learning, gst, gstreamer, inference, machine-learning, movidius, ncsdk, onnx, onnxruntime, tensorflow, tensorrt
- Language: C
- Homepage:
- Size: 895 KB
- Stars: 119
- Watchers: 37
- Forks: 29
- Open Issues: 45
Metadata Files:
- Readme: README.md
- License: COPYING
Awesome Lists containing this project
- awesome-gstreamer: Gst-Inference (RidgeRun)
README
GstInference | Coral from Google
:-------------------------:|:-------------------------:
[GstInference wiki](https://developer.ridgerun.com/wiki/index.php?title=GstInference) | [Coral prototyping products](https://coral.ai/products/#prototyping-products)

# GstInference
>See the **[GstInference wiki](https://developer.ridgerun.com/wiki/index.php?title=GstInference)** for the complete documentation.
GstInference is an open-source project from RidgeRun Engineering that provides a framework for integrating deep learning inference into GStreamer. Either use one of the included elements to do out-of-the-box inference with the most popular deep learning architectures, or leverage the base classes and utilities to support your own custom architecture.
This repo uses **[R²Inference](https://github.com/RidgeRun/r2inference)**, a C/C++ abstraction layer for a variety of machine learning frameworks. With R²Inference, a single C/C++ application can work with models from different frameworks. This is useful for running inference while taking advantage of different hardware resources such as CPUs, GPUs, or AI-optimized accelerators.
GstInference provides several example elements for common applications, such as [`Inception v4`](ext/r2inference/gstinceptionv4.c) for image classification and [`TinyYOLO v2`](ext/r2inference/gsttinyyolov2.c) for object detection. Examples are provided for performing inference on any GStreamer video stream.
## Installing GstInference
Follow the steps to get GstInference running on your platform:
* [Clone or download R²Inference](https://github.com/RidgeRun/r2inference)
* [Build R²Inference](https://developer.ridgerun.com/wiki/index.php?title=R2Inference/Getting_started/Building_the_library)
* [Clone or download GstInference](https://github.com/RidgeRun/gst-inference)
* [Build GstInference](https://developer.ridgerun.com/wiki/index.php?title=GstInference/Getting_started/Building_the_plugin)

## Examples
We provide GStreamer [example pipelines](https://developer.ridgerun.com/wiki/index.php?title=GstInference/Example_pipelines) for all our supported platforms, architectures, and backends.
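As a sketch of what those pipelines look like, the following classifies a live camera feed with the Inception v4 element. This is illustrative only: the element, pad, and property names follow the GstInference wiki, and the device path, model file, and tensor layer names are placeholders you must replace for your own setup.

```shell
# Illustrative only: requires GStreamer, GstInference, and R2Inference built
# with the TensorFlow backend, plus a frozen Inception v4 graph.
CAMERA='/dev/video0'                              # placeholder: capture device
MODEL_LOCATION='graph_inceptionv4_tensorflow.pb'  # placeholder: model file
INPUT_LAYER='input'                               # placeholder: input tensor name
OUTPUT_LAYER='Softmax'                            # placeholder: output tensor name

gst-launch-1.0 \
  v4l2src device=$CAMERA ! videoconvert ! tee name=t \
  t. ! videoscale ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  inceptionv4 name=net model-location=$MODEL_LOCATION backend=tensorflow \
      backend::input-layer=$INPUT_LAYER backend::output-layer=$OUTPUT_LAYER \
  net.src_bypass ! videoconvert ! classificationoverlay ! videoconvert ! autovideosink sync=false
```

The `tee` feeds the same frames to the element's `sink_model` pad (scaled for the network) and its `sink_bypass` pad (kept at full resolution), so the overlay can draw the predicted label on the untouched video.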
We also provide [example applications](https://developer.ridgerun.com/wiki/index.php?title=GstInference/Example_Applications) for classification and detection.
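In the same spirit, a detection pipeline swaps in the TinyYOLO v2 element and the detection overlay. Again, this is a hedged sketch: property names follow the GstInference wiki and may differ between releases, and the model and layer names are placeholders.

```shell
# Illustrative only: assumes the TensorFlow backend and a TinyYOLO v2 model.
MODEL_LOCATION='graph_tinyyolov2_tensorflow.pb'  # placeholder: model file
INPUT_LAYER='input/Placeholder'                  # placeholder: input tensor name
OUTPUT_LAYER='add_8'                             # placeholder: output tensor name

gst-launch-1.0 \
  filesrc location=video.mp4 ! decodebin ! videoconvert ! tee name=t \
  t. ! videoscale ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net model-location=$MODEL_LOCATION backend=tensorflow \
      backend::input-layer=$INPUT_LAYER backend::output-layer=$OUTPUT_LAYER \
  net.src_bypass ! detectionoverlay ! videoconvert ! autovideosink sync=false
```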
We also provide example trained models in our [model zoo](https://developer.ridgerun.com/wiki/index.php?title=GstInference/Model_Zoo).