https://github.com/justinchuby/onnx-safetensors
Use safetensors with ONNX 🤗
- Host: GitHub
- URL: https://github.com/justinchuby/onnx-safetensors
- Owner: justinchuby
- License: MIT
- Created: 2023-08-01T15:15:30.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2025-03-04T17:02:17.000Z (8 months ago)
- Last Synced: 2025-04-09T07:49:52.721Z (7 months ago)
- Topics: onnx, onnx-models, safetensors
- Language: Python
- Homepage: https://pypi.org/project/onnx-safetensors/
- Size: 62.5 KB
- Stars: 50
- Watchers: 5
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# onnx-safetensors
[CI](https://github.com/justinchuby/onnx-safetensors/actions/workflows/main.yml) | [PyPI](https://pypi.org/project/onnx-safetensors) | [Ruff](https://github.com/astral-sh/ruff) | [Black](https://github.com/psf/black)
ONNX extension for saving to and loading from safetensors 🤗.
## Features
- ✅ Load and save ONNX weights from and to safetensors
- ✅ Support all ONNX data types, including float8, float4 and 4-bit ints
- ✅ Allow ONNX backends (including ONNX Runtime) to use safetensors
## Install
```sh
pip install --upgrade onnx-safetensors
```
## Usage
### Load tensors to an ONNX model
> [!TIP]
> You can use safetensors as external data for ONNX.
```python
import os
import onnx
import onnx_safetensors
# Provide your ONNX model here
model: onnx.ModelProto
tensor_file = "path/to/onnx_model/model.safetensors"
base_dir = "path/to/onnx_model"
data_path = "model.safetensors"
# Apply weights from the safetensors file to the model, storing them as in-memory tensors
# NOTE: If the model grows beyond 2GB, you will need to offload the weights with onnx_safetensors.save_file,
# or with onnx.save using external data options, to keep the ONNX model valid
model = onnx_safetensors.load_file(model, tensor_file)
# If you want to use the safetensors file in ONNX Runtime:
# Use safetensors as external data in the ONNX model
model_with_external_data = onnx_safetensors.load_file_as_external_data(model, data_path, base_dir=base_dir)
# Save the modified model
# This model is a valid ONNX model using external data from the safetensors file
onnx.save(model_with_external_data, os.path.join(base_dir, "model_using_safetensors.onnx"))
```
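After `load_file`, the weights from the safetensors file sit in the model's initializers, so you can sanity-check them with plain ONNX APIs. A minimal sketch, continuing from the snippet above (it only assumes the `model` variable still holds the updated `ModelProto`):

```python
import onnx.numpy_helper

# Inspect the initializers that now carry the loaded weights
for initializer in model.graph.initializer:
    array = onnx.numpy_helper.to_array(initializer)
    print(initializer.name, array.dtype, array.shape)
```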
### Save weights to a safetensors file
```python
import os
import onnx
import onnx_safetensors
# Provide your ONNX model here
model: onnx.ModelProto
base_dir = "path/to/onnx_model"
data_path = "model.safetensors"
# Offload weights from the ONNX model to a safetensors file without modifying the model
onnx_safetensors.save_file(model, data_path, base_dir=base_dir, replace_data=False)  # Generates model.safetensors
# If you want to use the safetensors file in ONNX Runtime:
# Offload the weights to a safetensors file and have the model reference it as external data by setting replace_data=True
model_with_external_data = onnx_safetensors.save_file(model, data_path, base_dir=base_dir, replace_data=True)
# Save the modified model
# This model is a valid ONNX model using external data from the safetensors file
onnx.save(model_with_external_data, os.path.join(base_dir, "model_using_safetensors.onnx"))
```
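The saved model is a regular ONNX model whose external data happens to be a safetensors file, so any backend that resolves external data relative to the model path can run it. A hedged sketch with ONNX Runtime, continuing from the snippet above (`base_dir` is reused; the dummy input shape is a placeholder, not something the library prescribes):

```python
import os
import numpy as np
import onnxruntime as ort

# Keep model_using_safetensors.onnx and model.safetensors in the same directory
# so the external data reference resolves at load time
session = ort.InferenceSession(
    os.path.join(base_dir, "model_using_safetensors.onnx"),
    providers=["CPUExecutionProvider"],
)

# Query the session for the real input name; the shape below is illustrative only
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy_input})
print([output.shape for output in outputs])
```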
## Examples
- [Tutorial notebook](examples/tutorial.ipynb)