https://github.com/pinto0309/sng4onnx
A simple tool that automatically generates and assigns an OP name to each OP in an old format ONNX file.
- Host: GitHub
- URL: https://github.com/pinto0309/sng4onnx
- Owner: PINTO0309
- License: mit
- Created: 2022-10-13T14:40:02.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2024-04-22T12:52:43.000Z (over 1 year ago)
- Last Synced: 2024-05-01T15:19:58.492Z (over 1 year ago)
- Topics: deep-learning, machine-learning, model-converter, onnx
- Language: Python
- Homepage:
- Size: 11.7 KB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# sng4onnx
A simple tool that automatically generates and assigns an OP name to each OP in an old format ONNX file.
**S**imple op **N**ame **G**enerator for **ONNX**.

https://github.com/PINTO0309/simple-onnx-processing-tools
[Downloads](https://pepy.tech/project/sng4onnx) [PyPI](https://pypi.org/project/sng4onnx/) [CodeQL](https://github.com/PINTO0309/sng4onnx/actions?query=workflow%3ACodeQL)
# Key concept
- [x] Automatically generates and assigns an OP name to each OP in an old format ONNX file (see the sketch below).
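For readers unfamiliar with the problem being solved, here is a minimal sketch of a graph whose node has no name, which is the situation sng4onnx fixes. It uses only the standard `onnx.helper` API; the tensor shapes and graph name are made up for illustration.

```python
# Sketch: a node created without a "name" argument keeps an empty name field.
# sng4onnx generates and assigns names for exactly these cases.
# Shapes and the graph name are illustrative only.
from onnx import TensorProto, helper

x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3, 224, 224])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 3, 224, 224])

node = helper.make_node("Relu", inputs=["x"], outputs=["y"])  # no name= given
graph = helper.make_graph([node], "unnamed_example", [x], [y])
model = helper.make_model(graph)

print(repr(model.graph.node[0].name))  # '' -> the field sng4onnx fills in
```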
## 1. Setup
### 1-1. HostPC
```bash
### option
$ echo 'export PATH=$HOME/.local/bin:$PATH' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U sng4onnx
```
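As a quick sanity check after installation, the following sketch simply imports the package and prints the signature of `generate()`, the function documented in section 3 below; nothing here is specific to any model.

```python
# Post-install check: confirm the package imports and inspect generate()'s signature.
import inspect
from importlib.metadata import version

from sng4onnx import generate

print("sng4onnx", version("sng4onnx"))  # installed distribution version
print(inspect.signature(generate))      # expects input_onnx_file_path, onnx_graph, ...
```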
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker

## 2. CLI Usage
```
$ sng4onnx -h

usage:
  sng4onnx [-h]
  -if INPUT_ONNX_FILE_PATH
  -of OUTPUT_ONNX_FILE_PATH
  [-n]

optional arguments:
  -h, --help
    show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
    Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
    Output onnx file path.

  -n, --non_verbose
    Do not show all information logs. Only error logs are displayed.
```

## 3. In-script Usage
```python
>>> from sng4onnx import generate
>>> help(generate)

Help on function generate in module sng4onnx.onnx_opname_generator:

generate(
  input_onnx_file_path: Union[str, NoneType] = '',
  onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
  output_onnx_file_path: Union[str, NoneType] = '',
  non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        Default: ''

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If specified, ignore input_onnx_file_path and process onnx_graph.

    output_onnx_file_path: Optional[str]
        Output onnx file path. If not specified, no ONNX file is output.
        Default: ''

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    renamed_graph: onnx.ModelProto
        Renamed onnx ModelProto.
```
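Because `generate()` also accepts an in-memory `onnx.ModelProto` via `onnx_graph`, the renaming step can be chained with other processing without re-reading the file. A minimal sketch under that assumption; `model.onnx` and `model_renamed.onnx` are placeholder file names, and only the parameters shown in the help output above are used.

```python
# Sketch: rename OPs on an already-loaded model instead of passing a file path.
# "model.onnx" / "model_renamed.onnx" are placeholder names.
import onnx
from sng4onnx import generate

model = onnx.load("model.onnx")

# onnx_graph takes precedence over input_onnx_file_path; with
# output_onnx_file_path omitted, no file is written by generate() itself.
renamed_model = generate(
    onnx_graph=model,
    non_verbose=True,
)

onnx.save(renamed_model, "model_renamed.onnx")
```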
## 4. CLI Execution
```bash
$ sng4onnx \
--input_onnx_file_path emotion-ferplus-8.onnx \
--output_onnx_file_path emotion-ferplus-8_renamed.onnx
```

## 5. In-script Execution
```python
from sng4onnx import generate

onnx_graph = generate(
input_onnx_file_path="fusionnet_180x320.onnx",
output_onnx_file_path="fusionnet_180x320_renamed.onnx",
)
```
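To confirm the result of the call above, the written file can be reloaded to check that every node now carries a name. This sketch assumes the output file from the example (`fusionnet_180x320_renamed.onnx`) and uses only standard `onnx` protobuf fields.

```python
# Sketch: verify that every OP in the renamed model carries a name.
import onnx

renamed = onnx.load("fusionnet_180x320_renamed.onnx")
unnamed = [n for n in renamed.graph.node if not n.name]
print(f"{len(renamed.graph.node)} nodes, {len(unnamed)} still unnamed")  # expect 0 unnamed
for node in renamed.graph.node:
    print(node.op_type, node.name)
```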
## 6. Sample

https://github.com/onnx/models/blob/main/vision/classification/resnet/model/resnet18-v1-7.onnx
### Before
### After
## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo

## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues