https://github.com/sdpython/onnxcustom
Tutorial on how to convert machine learned models into ONNX
- Host: GitHub
- URL: https://github.com/sdpython/onnxcustom
- Owner: sdpython
- License: mit
- Created: 2020-06-30T11:33:46.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2023-03-11T20:52:31.000Z (about 2 years ago)
- Last Synced: 2025-05-06T22:13:46.315Z (18 days ago)
- Language: Python
- Homepage: http://www.xavierdupre.fr/app/onnxcustom/helpsphinx/index.html
- Size: 6.04 MB
- Stars: 16
- Watchers: 3
- Forks: 3
- Open Issues: 5
Metadata Files:
- Readme: README.rst
- Changelog: HISTORY.rst
- License: LICENSE.txt
README
.. image:: https://circleci.com/gh/sdpython/onnxcustom/tree/master.svg?style=svg
    :target: https://circleci.com/gh/sdpython/onnxcustom/tree/master

.. image:: https://travis-ci.com/sdpython/onnxcustom.svg?branch=master
    :target: https://app.travis-ci.com/github/sdpython/onnxcustom
    :alt: Build status

.. image:: https://ci.appveyor.com/api/projects/status/a3sn45a2fayoxb5q?svg=true
    :target: https://ci.appveyor.com/project/sdpython/onnxcustom
    :alt: Build Status Windows

.. image:: https://codecov.io/gh/sdpython/onnxcustom/branch/master/graph/badge.svg
    :target: https://codecov.io/gh/sdpython/onnxcustom

.. image:: https://badge.fury.io/py/onnxcustom.svg
    :target: http://badge.fury.io/py/onnxcustom

.. image:: http://img.shields.io/github/issues/sdpython/onnxcustom.png
    :alt: GitHub Issues
    :target: https://github.com/sdpython/onnxcustom/issues

.. image:: https://img.shields.io/badge/license-MIT-blue.svg
    :alt: MIT License
    :target: http://opensource.org/licenses/MIT

.. image:: https://pepy.tech/badge/onnxcustom/month
    :target: https://pepy.tech/project/onnxcustom/month
    :alt: Downloads

.. image:: https://img.shields.io/github/forks/sdpython/onnxcustom.svg
    :target: https://github.com/sdpython/onnxcustom/
    :alt: Forks

.. image:: https://img.shields.io/github/stars/sdpython/onnxcustom.svg
    :target: https://github.com/sdpython/onnxcustom/
    :alt: Stars

.. image:: https://img.shields.io/github/repo-size/sdpython/onnxcustom
    :target: https://github.com/sdpython/onnxcustom/
    :alt: size

onnxcustom: custom ONNX
=======================

.. image:: https://raw.githubusercontent.com/sdpython/onnxcustom/master/_doc/sphinxdoc/source/_static/project_ico.png
    :width: 50

`documentation <http://www.xavierdupre.fr/app/onnxcustom/helpsphinx/index.html>`_

Examples, tutorial on how to convert machine learned models into ONNX,
implement your own converter or runtime, or even train with ONNX / onnxruntime.
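As a flavour of that workflow, here is a minimal sketch (not taken from
*onnxcustom* itself; it assumes *scikit-learn*, *skl2onnx* and *onnxruntime*
are installed) that converts a *scikit-learn* model and scores it with
*onnxruntime*::

    # Convert a scikit-learn model to ONNX with skl2onnx, then score it with
    # onnxruntime; the model and data are purely illustrative.
    import numpy
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from skl2onnx import to_onnx
    import onnxruntime

    X, y = load_iris(return_X_y=True)
    X = X.astype(numpy.float32)
    model = LogisticRegression(max_iter=500).fit(X, y)

    # to_onnx infers the input signature from the sample it receives
    onx = to_onnx(model, X[:1])

    sess = onnxruntime.InferenceSession(
        onx.SerializeToString(), providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    print(sess.run(None, {input_name: X[:5]})[0])

The tutorial goes further and shows how to implement custom converters or
runtimes on top of this basic workflow.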
The function *check* or the command line ``python -m onnxcustom check``
checks that the module is properly installed and returns processing
time for a couple of functions, or simply::

    import onnxcustom
    onnxcustom.check()

The documentation also introduces *onnx* and *onnxruntime* for
inference and training.
The tutorial related to *scikit-learn*
has been merged into the `sklearn-onnx documentation
<https://onnx.ai/sklearn-onnx/>`_.
Among the tools this package implements, you may find:

* a tool to convert NVidia Profiler logs into a dataframe,
* a SGD optimizer similar to what *scikit-learn* implements but
  based on *onnxruntime-training* and able to train on CPU and GPU,
* functions to manipulate *onnx* graphs (see the sketch after this list).
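For the last point, here is a minimal sketch of graph manipulation using the
official *onnx* API directly (it does not rely on *onnxcustom*'s own helpers;
names and shapes are illustrative)::

    # Build a tiny ONNX graph with the official onnx API, then walk it;
    # onnxcustom's own helpers are not used here.
    import numpy
    from onnx import TensorProto, checker, helper

    X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [None, 2])
    Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [None, 2])
    coef = helper.make_tensor(
        "coef", TensorProto.FLOAT, [2, 2],
        numpy.eye(2, dtype=numpy.float32).ravel().tolist())

    node = helper.make_node("MatMul", ["X", "coef"], ["Y"], name="matmul1")
    graph = helper.make_graph([node], "tiny_graph", [X], [Y], initializer=[coef])
    model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 15)])
    checker.check_model(model)

    # print every node with its inputs and outputs
    for n in model.graph.node:
        print(n.op_type, list(n.input), "->", list(n.output))

Loading an existing model with ``onnx.load`` gives the same kind of object to
work with.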
**Installation of onnxruntime-training**

*onnxruntime-training* is only available on Linux. The CPU version
can be installed with the following instruction::

    pip install onnxruntime-training --extra-index-url https://download.onnxruntime.ai/onnxruntime_nightly_cpu.html
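A quick way to tell whether the training build is the one actually imported is
shown below (a small sketch assuming the training wheel ships the
``onnxruntime.training`` submodule, which the plain inference wheel does not)::

    # Sanity check: the training wheel is assumed to ship an
    # onnxruntime.training submodule, unlike the plain inference wheel.
    import importlib.util
    import onnxruntime

    print("onnxruntime", onnxruntime.__version__, "on", onnxruntime.get_device())
    has_training = importlib.util.find_spec("onnxruntime.training") is not None
    print("training build:", has_training)

The ``python -m onnxcustom check`` command mentioned earlier also verifies the
module's own installation.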
Versions using GPU with CUDA or ROCm are available. Check
`download.onnxruntime.ai <https://download.onnxruntime.ai/>`_
to find a specific version.
You can use it on Windows
inside WSL (Windows Subsystem for Linux) or compile it for CPU::

    python tools\ci_build\build.py --skip_tests --build_dir .\build\Windows --config Release --build_shared_lib --build_wheel --numpy_version= --cmake_generator="Visual Studio 16 2019" --enable_training --enable_training_ops
GPU versions work better on WSL; see the documentation page
*Build onnxruntime on WSL (Windows Subsystem for Linux)*.
*onnxcustom* can be installed from PyPI.
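For example (the package name on PyPI matches the badge above)::

    pip install onnxcustom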