Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/vertexclique/orkhon
Orkhon: ML Inference Framework and Server Runtime
async data-parallelism inference-server machine-learning multiprocessing python3 tensorflow
Last synced: 10 days ago
- Host: GitHub
- URL: https://github.com/vertexclique/orkhon
- Owner: vertexclique
- License: mit
- Created: 2019-05-18T23:24:53.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2021-02-01T15:23:47.000Z (almost 4 years ago)
- Last Synced: 2024-10-14T10:15:38.938Z (27 days ago)
- Topics: async, data-parallelism, inference-server, machine-learning, multiprocessing, python3, tensorflow
- Language: Rust
- Homepage:
- Size: 26.2 MB
- Stars: 146
- Watchers: 5
- Forks: 5
- Open Issues: 3
- Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
README
-----------------
# Orkhon: ML Inference Framework and Server Runtime
## What is it?
Orkhon is a Rust framework for machine learning that runs inference/prediction code written in Python, serves frozen models, and processes unseen data. It is mainly focused on serving models and processing unseen data in a performant manner. Instead of using Python directly and running into scalability problems on servers, this framework tackles them with a built-in async API.
## Main features
* Sync & Async API for models.
* Easily embeddable engine for well-known Rust web frameworks.
* API contract for interacting with Python code.
* High processing throughput:
    * ~4.8361 GiB/s prediction throughput
    * 3,000 concurrent requests take ~4 ms on average
* Python Module caching

## Installation
You can include Orkhon in your project with:
```toml
[dependencies]
orkhon = "0.2"
```

## Dependencies
You will need:
* If you use the `pymodel` feature, the Python development dependencies must be installed and a proper Python runtime must be available for Orkhon to use in your project.
* If you want TensorFlow inference, TensorFlow must be installed as a library for linking.
* The ONNX interface doesn't need extra dependencies from the system side.
* Point your `PYTHONHOME` environment variable to your Python installation.
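Backend-specific code is gated behind Cargo features, so the feature matching your backend has to be enabled in your manifest. A minimal sketch, assuming the `pymodel` and `onnxmodel` feature names mentioned above (check the crate's docs.rs page for the exact, current feature list):

```toml
[dependencies]
# Enable only the backends you need; "onnxmodel" and "pymodel" are the
# feature names referenced in this README.
orkhon = { version = "0.2", features = ["onnxmodel"] }
```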
## Python API contract
For the Python API contract, you can take a look at the [Project Documentation](https://docs.rs/orkhon).
## Examples
#### Request a TensorFlow prediction asynchronously

```rust
use orkhon::prelude::*;
use orkhon::tcore::prelude::*;
use orkhon::ttensor::prelude::*;
use rand::*;
use std::path::PathBuf;

let o = Orkhon::new()
    .config(
        OrkhonConfig::new()
            .with_input_fact_shape(InferenceFact::dt_shape(f32::datum_type(), tvec![10, 100])),
    )
    .tensorflow(
        "model_which_will_be_tested",
        PathBuf::from("tests/protobuf/manual_input_infer/my_model.pb"),
    )
    .shareable();

let mut rng = thread_rng();
let vals: Vec<_> = (0..1000).map(|_| rng.gen::<f32>()).collect();
let input = tract_ndarray::arr1(&vals).into_shape((10, 100)).unwrap();

let o = o.get();
let handle = async move {
    let processor = o.tensorflow_request_async(
        "model_which_will_be_tested",
        ORequest::with_body(TFRequest::new().body(input.into())),
    );
    processor.await
};
let resp = block_on(handle).unwrap();
```

#### Request an ONNX prediction synchronously
This example needs `onnxmodel` feature enabled.
```rust
use orkhon::prelude::*;
use orkhon::tcore::prelude::*;
use orkhon::ttensor::prelude::*;
use rand::*;
use std::path::PathBuf;

let o = Orkhon::new()
    .config(
        OrkhonConfig::new()
            .with_input_fact_shape(InferenceFact::dt_shape(f32::datum_type(), tvec![10, 100])),
    )
    .onnx(
        "model_which_will_be_tested",
        PathBuf::from("tests/protobuf/onnx_model/example.onnx"),
    )
    .build();

let mut rng = thread_rng();
let vals: Vec<_> = (0..1000).map(|_| rng.gen::<f32>()).collect();
let input = tract_ndarray::arr1(&vals).into_shape((10, 100)).unwrap();

let resp = o
    .onnx_request(
        "model_which_will_be_tested",
        ORequest::with_body(ONNXRequest::new().body(input.into())),
    )
    .unwrap();
assert_eq!(resp.body.output.len(), 1);
```

## License
Orkhon is licensed under the [MIT license](https://github.com/vertexclique/orkhon/blob/master/LICENSE).
## Documentation
Official documentation is hosted on [docs.rs](https://docs.rs/orkhon).
## Getting Help
Please head to our [Gitter](https://gitter.im/orkhonml/community) or use [StackOverflow](https://stackoverflow.com/questions/tagged/orkhon).

## Discussion and Development
We use [Gitter](https://gitter.im/orkhonml/community) for development discussions. Also, please don't hesitate to open issues on GitHub to ask for features, report bugs, comment on the design, and more!
More interaction and more ideas are better!

## Contributing to Orkhon [![Open Source Helpers](https://www.codetriage.com/vertexclique/orkhon/badges/users.svg)](https://www.codetriage.com/vertexclique/orkhon)
All contributions, bug reports, bug fixes, documentation improvements, enhancements and ideas are welcome.
A detailed overview on how to contribute can be found in the [CONTRIBUTING guide](.github/CONTRIBUTING.md) on GitHub.