https://github.com/pinto0309/sbi4onnx
A very simple script that only initializes the batch size of ONNX. Simple Batchsize Initialization for ONNX.
- Host: GitHub
- URL: https://github.com/pinto0309/sbi4onnx
- Owner: PINTO0309
- License: mit
- Created: 2022-05-02T10:12:56.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2024-05-28T23:43:19.000Z (over 1 year ago)
- Last Synced: 2025-04-16T13:36:38.891Z (6 months ago)
- Topics: cli, model-converter, models, onnx, python
- Language: Python
- Homepage:
- Size: 15 MB
- Stars: 7
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# sbi4onnx
A very simple script that only initializes the batch size of ONNX. **S**imple **B**atchsize **I**nitialization for **ONNX**.

https://github.com/PINTO0309/simple-onnx-processing-tools
# Key concept
- [x] Initializes the ONNX batch size with the specified characters.
- [x] This tool is not a panacea and may fail to initialize models with very complex structures. For example, it may fail on a model that contains a `Reshape` whose target shape involves the batch size, or a `Gemm` whose output has a batch dimension other than 1.
- [x] A `Reshape` in a graph cannot contain two or more undefined dimensions, such as `-1`, `N`, `None`, or `unk_*`. Therefore, before initializing the batch size with this tool, make sure that the `Reshape` does not already contain a `-1` dimension (see the pre-check sketch after this list). If it does, it may still be possible to initialize the batch size by first rewriting the undefined dimensions of the relevant `Reshape` to static values using **[sam4onnx](https://github.com/PINTO0309/sam4onnx)**.
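As a quick pre-check for the `Reshape` caveat above, a sketch like the following (illustration only; it assumes a local file named `model.onnx`, which is not part of this repository) lists every `Reshape` whose constant target shape already contains a `-1`, using only the standard `onnx` package:

```python
# Hypothetical pre-check: list Reshape nodes whose constant shape input
# already contains -1, i.e. candidates to pin to static values with
# sam4onnx before running sbi4onnx. Assumes "model.onnx" exists locally.
import onnx
from onnx import numpy_helper

model = onnx.load("model.onnx")
inits = {i.name: numpy_helper.to_array(i) for i in model.graph.initializer}

for node in model.graph.node:
    if node.op_type != "Reshape" or len(node.input) < 2:
        continue
    shape = inits.get(node.input[1])  # second Reshape input is the target shape
    if shape is not None and (shape == -1).any():
        print(node.name or "<unnamed>", shape.tolist())
```

Target shapes produced by `Constant` nodes rather than initializers are not covered by this sketch, but the idea is the same: any reported node is a candidate to rewrite with sam4onnx before running this tool.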
## 1. Setup

### 1-1. HostPC
```bash
### option
$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install --no-deps -U onnx-simplifier \
&& pip install -U sbi4onnx
```
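As an optional post-install check (a minimal sketch; it only verifies that the packages installed above import cleanly), you can run:

```python
# Optional sanity check after installation: confirms that onnx and the
# sbi4onnx entry point used in the examples below are importable.
import onnx
from sbi4onnx import initialize

print("onnx", onnx.__version__)
print("initialize is callable:", callable(initialize))
```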
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker

## 2. CLI Usage
```
$ sbi4onnx -h

usage:
  sbi4onnx [-h]
  -if INPUT_ONNX_FILE_PATH
  -of OUTPUT_ONNX_FILE_PATH
  -ics INITIALIZATION_CHARACTER_STRING
  [-dos]
  [-n]

optional arguments:
  -h, --help
      show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
      Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
      Output onnx file path.

  -ics INITIALIZATION_CHARACTER_STRING, --initialization_character_string INITIALIZATION_CHARACTER_STRING
      String to initialize batch size. "-1" or "N" or "xxx", etc...
      Default: '-1'

  -dos, --disable_onnxsim
      Suppress the execution of onnxsim on the backend and dare to leave redundant processing.

  -n, --non_verbose
      Do not show all information logs. Only error logs are displayed.
```

## 3. In-script Usage
```python
>>> from sbi4onnx import initialize
>>> help(initialize)

Help on function initialize in module sbi4onnx.onnx_batchsize_initialize:

initialize(
    input_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
    output_onnx_file_path: Union[str, NoneType] = '',
    initialization_character_string: Union[str, NoneType] = '-1',
    non_verbose: Union[bool, NoneType] = False,
    disable_onnxsim: Union[bool, NoneType] = False,
) -> onnx.onnx_ml_pb2.ModelProto

Parameters
----------
input_onnx_file_path: Optional[str]
    Input onnx file path.
    Either input_onnx_file_path or onnx_graph must be specified.
    Default: ''

onnx_graph: Optional[onnx.ModelProto]
    onnx.ModelProto.
    Either input_onnx_file_path or onnx_graph must be specified.
    onnx_graph If specified, ignore input_onnx_file_path and process onnx_graph.

output_onnx_file_path: Optional[str]
    Output onnx file path. If not specified, no ONNX file is output.
    Default: ''

initialization_character_string: Optional[str]
    String to initialize batch size. "-1" or "N" or "xxx", etc...
    Default: '-1'

disable_onnxsim: Optional[bool]
    Suppress the execution of onnxsim on the backend and dare to leave redundant processing.
    Default: False

non_verbose: Optional[bool]
    Do not show all information logs. Only error logs are displayed.
    Default: False

Returns
-------
changed_graph: onnx.ModelProto
    Changed onnx ModelProto.
```

## 4. CLI Execution
```bash
$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string N

$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string -1

$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string abcdefg
```
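To confirm the result of a CLI run, a short check like the one below (a sketch, assuming the `whenet_Nx224x224.onnx` output produced by the commands above) prints the dimensions of every graph input; the batch position should now show the value passed via `--initialization_character_string`:

```python
# Sketch: inspect the batch dimension of each graph input after conversion.
# Assumes the whenet_Nx224x224.onnx output from the CLI examples above.
import onnx

model = onnx.load("whenet_Nx224x224.onnx")
for inp in model.graph.input:
    dims = [
        d.dim_param if d.dim_param else d.dim_value
        for d in inp.type.tensor_type.shape.dim
    ]
    print(inp.name, dims)
```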
## 5. In-script Execution

```python
from sbi4onnx import initialize

onnx_graph = initialize(
    input_onnx_file_path="whenet_224x224.onnx",
    output_onnx_file_path="whenet_Nx224x224.onnx",
    initialization_character_string="abcdefg",
)

# or

onnx_graph = initialize(
    onnx_graph=graph,
    initialization_character_string="abcdefg",
)
```
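The returned `onnx.ModelProto` can be handled with the standard `onnx` and `onnxruntime` APIs. The sketch below is an illustration only: it assumes `onnxruntime` is installed separately and that the model takes a single float32 input whose non-batch dimensions are 224-sized where symbolic. It saves the result of the second variant above (which does not write a file by itself) and then runs a batch of 4:

```python
# Sketch: persist the returned ModelProto and run it with a batch size
# chosen at runtime. onnxruntime is assumed to be installed separately.
import numpy as np
import onnx
import onnxruntime as ort

onnx.save(onnx_graph, "whenet_Nx224x224.onnx")  # onnx_graph returned by initialize()

sess = ort.InferenceSession("whenet_Nx224x224.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
print(inp.name, inp.shape)  # the batch position should now be dynamic

# Build a dummy batch of 4; symbolic non-batch dims fall back to 224 here.
shape = [4] + [d if isinstance(d, int) else 224 for d in inp.shape[1:]]
outputs = sess.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})
print([o.shape for o in outputs])
```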
## 6. Sample

### Before
### After
## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo

## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues