Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
A Flutter plugin for OnnxRuntime that provides an easy, flexible, and fast Dart API to integrate ONNX models in Flutter apps across mobile and desktop platforms.
https://github.com/gtbluesky/onnxruntime_flutter
- Host: GitHub
- URL: https://github.com/gtbluesky/onnxruntime_flutter
- Owner: gtbluesky
- License: mit
- Created: 2023-08-30T08:28:38.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-06-28T09:16:33.000Z (4 months ago)
- Last Synced: 2024-06-28T10:39:02.770Z (4 months ago)
- Topics: dart, ffi, flutter, onnx, onnxruntime, pytorch, tensorflow
- Language: C++
- Homepage:
- Size: 42.9 MB
- Stars: 53
- Watchers: 5
- Forks: 11
- Open Issues: 12
- Metadata Files:
  - Readme: README.md
  - Changelog: CHANGELOG.md
  - License: LICENSE
Awesome Lists containing this project
README
# OnnxRuntime Plugin
[![pub package](https://img.shields.io/pub/v/onnxruntime.svg)](https://pub.dev/packages/onnxruntime)

## Overview
A Flutter plugin for OnnxRuntime, built on `dart:ffi`, that provides an easy, flexible, and fast Dart API to integrate ONNX models in Flutter apps across mobile and desktop platforms.
| **Platform** | Android | iOS | Linux | macOS | Windows |
|-------------------|---------------|-----|-------|-------|---------|
| **Compatibility** | API level 21+ | * | * | * | * |
| **Architecture**  | arm32/arm64   | *   | *     | *     | *       |

\*: [Consistent with Flutter](https://docs.flutter.dev/reference/supported-platforms)
## Key Features
* Multi-platform support for Android, iOS, Linux, macOS, Windows, and Web (coming soon).
* Flexibility to use any ONNX model.
* Acceleration using multi-threading.
* Similar structure to the OnnxRuntime Java and C# APIs.
* Inference speed on par with native Android/iOS apps built on the Java/Objective-C APIs.
* Run inference in a separate isolate to prevent jank in the UI thread (see `runAsync` under Performing inference below).

## Getting Started
In your Flutter project, add the dependency:
```yml
dependencies:
  ...
  onnxruntime: x.y.z
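  # use the latest version listed on https://pub.dev/packages/onnxruntime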
```

## Usage example
### Import
```dart
import 'package:onnxruntime/onnxruntime.dart';
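// The snippets below also use Float32List and rootBundle:
import 'dart:typed_data';
import 'package:flutter/services.dart' show rootBundle;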
```

### Initializing environment
```dart
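// Typically called once at app startup (e.g. in main() before runApp()),
// before any session is created.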
OrtEnv.instance.init();
```

### Creating the Session
```dart
// rootBundle comes from package:flutter/services.dart (see Import above).
final sessionOptions = OrtSessionOptions();
const assetFileName = 'assets/models/test.onnx';
final rawAssetFile = await rootBundle.load(assetFileName);
final bytes = rawAssetFile.buffer.asUint8List();
final session = OrtSession.fromBuffer(bytes, sessionOptions);
```

### Performing inference
```dart
// A 1x2x3 float tensor holds 6 values (example data shown here);
// Float32List comes from dart:typed_data.
final shape = [1, 2, 3];
final data = Float32List.fromList([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]);
final inputOrt = OrtValueTensor.createTensorWithDataList(data, shape);
// The map key must match the model's input name.
final inputs = {'input': inputOrt};
final runOptions = OrtRunOptions();
final outputs = await session.runAsync(runOptions, inputs);
// Read the results before releasing the native resources.
inputOrt.release();
runOptions.release();
outputs?.forEach((element) {
  element?.release();
});
```

### Releasing environment
```dart
OrtEnv.instance.release();
```
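Putting the pieces together, here is a minimal sketch of the full init → load → run → release flow. The asset path, input name `input`, tensor shape, and the helper name `runOnnxModel` are placeholders for your own model and code, and `session.release()`/`sessionOptions.release()` are assumed to follow the same `release()` pattern as the other wrappers above:
```dart
import 'dart:typed_data';

import 'package:flutter/services.dart' show rootBundle;
import 'package:onnxruntime/onnxruntime.dart';

Future<void> runOnnxModel() async {
  // 1. Initialize the environment once.
  OrtEnv.instance.init();

  // 2. Create a session from the bundled model bytes.
  final sessionOptions = OrtSessionOptions();
  final rawAssetFile = await rootBundle.load('assets/models/test.onnx');
  final bytes = rawAssetFile.buffer.asUint8List();
  final session = OrtSession.fromBuffer(bytes, sessionOptions);

  // 3. Run inference off the UI thread with a 1x2x3 float input tensor.
  final inputOrt = OrtValueTensor.createTensorWithDataList(
      Float32List.fromList([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]), [1, 2, 3]);
  final runOptions = OrtRunOptions();
  final outputs = await session.runAsync(runOptions, {'input': inputOrt});

  // 4. Release native resources once the outputs have been consumed.
  inputOrt.release();
  runOptions.release();
  outputs?.forEach((element) => element?.release());
  session.release();
  sessionOptions.release();
  OrtEnv.instance.release();
}
```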