Jetson Benchmark
- Host: GitHub
- URL: https://github.com/nvidia-ai-iot/jetson_benchmarks
- Owner: NVIDIA-AI-IOT
- License: MIT
- Created: 2020-04-23T03:14:54.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2023-12-01T20:43:18.000Z (almost 2 years ago)
- Last Synced: 2024-05-14T00:14:42.690Z (over 1 year ago)
- Topics: benchmark, jetson, jetson-nano, xavier-nx
- Language: Python
- Homepage:
- Size: 38.1 KB
- Stars: 340
- Watchers: 13
- Forks: 70
- Open Issues: 24
Metadata Files:
- Readme: README.md
- License: LICENSE.md
# Benchmarks Targeted for Jetson (Using GPU+2DLA)
The script will run the following benchmarks:

| Model | Input Image Resolution |
| :--- | :--- |
| Inception V4 | 299x299 |
| ResNet-50 | 224x224 |
| OpenPose | 256x456 |
| VGG-19 | 224x224 |
| YOLO-V3 | 608x608 |
| Super Resolution | 481x321 |
| Unet | 256x256 |

For benchmark results on all NVIDIA Jetson products, please see the [NVIDIA Jetson benchmarks webpage](https://developer.nvidia.com/embedded/jetson-benchmarks).
The following scripts are included:
1. Installation requirements for running the benchmark script (install_requirements.sh)
2. CSV files containing parameters (benchmark_csv folder)
3. Model download script (utils/download_models.py)
4. Benchmark script (benchmark.py)

### Version Dependencies
- JetPack 4.4+
- TensorRT 7+

### Set up instructions
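Before cloning, it can help to confirm the JetPack and TensorRT versions on the device. A minimal check, assuming a standard JetPack image where these ship as Debian packages (these commands are a suggestion, not part of the repository's scripts):

```
# Report the installed JetPack meta-package and the L4T release (assumes a JetPack 4.4+ image)
dpkg-query --show nvidia-jetpack
cat /etc/nv_tegra_release

# List TensorRT packages to confirm version 7 or newer
dpkg -l | grep -i tensorrt
```

Then clone the repository and create a folder for the models: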
```
git clone https://github.com/NVIDIA-AI-IOT/jetson_benchmarks.git
cd jetson_benchmarks
mkdir models   # folder to store models (optional)
```

### Install Requirements
```
sudo sh install_requirements.sh
```

Note: All libraries will be installed for `python3`.
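As a quick sanity check that the Python environment can see TensorRT (the Python bindings ship with JetPack; this check is a suggestion, not part of the repository's scripts):

```
# Print the TensorRT version visible to python3 (assumes JetPack's TensorRT Python bindings are installed)
python3 -c "import tensorrt; print(tensorrt.__version__)"
```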
# For Jetson Xavier NX

### Download Models
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/nx-benchmarks.csv --save_dir
```
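The `--save_dir` value is the directory that will hold the downloaded models. A filled-in example, assuming the command is run from the repository root, the CSVs live in the repository's `benchmark_csv` folder, and the `models` folder created above is used (the paths are example choices, not prescribed by the repository):

```
# Download all models listed in the Xavier NX CSV into ./models (paths are example choices)
python3 utils/download_models.py --all \
    --csv_file_path benchmark_csv/nx-benchmarks.csv \
    --save_dir $(pwd)/models
```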
### Running Benchmarks

#### Running All Benchmark Models at Once
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
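As with the download step, `--model_dir` expects the directory holding the downloaded models. Continuing the example layout assumed above:

```
# Run every model from the Xavier NX CSV against the models downloaded earlier (paths are example choices)
sudo python3 benchmark.py --all \
    --csv_file_path benchmark_csv/nx-benchmarks.csv \
    --model_dir $(pwd)/models
```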
#### Sample Output

| **Model Name** | **FPS** |
| :--- | :--- |
| inception_v4 | 311.73 |
| vgg19_N2 | 66.43 |
| super_resolution_bsd500 | 150.46 |
| unet-segmentation | 145.42 |
| pose_estimation | 237.1 |
| yolov3-tiny-416 | 546.69 |
| ResNet50_224x224 | 824.02 |
| ssd-mobilenet-v1 | 887.6 |
#### Running an Individual Benchmark Model
1. For Inception V4
```
sudo python3 benchmark.py --model_name inception_v4 --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
2. For VGG19
```
sudo python3 benchmark.py --model_name vgg19 --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
3. For Super Resolution
```
sudo python3 benchmark.py --model_name super_resolution --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
4. For UNET Segmentation
```
sudo python3 benchmark.py --model_name unet --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
5. For Pose Estimation
```
sudo python3 benchmark.py --model_name pose_estimation --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
6. For Tiny-YOLO-V3
```
sudo python3 benchmark.py --model_name tiny-yolov3 --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
7. For ResNet-50
```
sudo python3 benchmark.py --model_name resnet --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
8. For SSD-MobileNet-V1
```
sudo python3 benchmark.py --model_name ssd-mobilenet-v1 --csv_file_path /benchmark_csv/nx-benchmarks.csv --model_dir
```
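To run several of the individual benchmarks back to back, a small wrapper loop works; the model names follow the list above, and the CSV and model paths are example choices, not prescribed by the repository:

```
# Run a subset of the individual benchmarks in sequence (model list and paths are example choices)
for model in inception_v4 vgg19 super_resolution unet; do
    sudo python3 benchmark.py --model_name "$model" \
        --csv_file_path benchmark_csv/nx-benchmarks.csv \
        --model_dir $(pwd)/models
done
```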
# For Jetson AGX Xavier

Please follow the setup and installation requirements above.

### Download Models
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/xavier-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Jetson AGX Xavier
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/xavier-benchmarks.csv \
--model_dir \
--jetson_devkit xavier \
--gpu_freq 1377000000 --dla_freq 1395200000 --power_mode 0 --jetson_clocks
```
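The AGX Xavier command above pins the GPU/DLA clock frequencies and selects power mode 0. To confirm the device state before or after a run, the stock Jetson tools can be queried directly; this is a suggested check, not part of the repository's scripts:

```
# Show the active nvpmodel power mode and the current clock configuration
sudo nvpmodel -q
sudo jetson_clocks --show
```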
# For Jetson TX2 and Jetson Nano

Please follow the setup and installation requirements above.

### Download Models
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/tx2-nano-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Jetson TX2
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/tx2-nano-benchmarks.csv \
--model_dir \
--jetson_devkit tx2 \
--gpu_freq 1122000000 --power_mode 3 --precision fp16
```

### Running All Benchmark Models at Once on Jetson Nano
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/tx2-nano-benchmarks.csv \
--model_dir \
--jetson_devkit nano \
--gpu_freq 921600000 --power_mode 0 --precision fp16
```

# For Jetson Orin
Please follow the setup and installation requirements above.

### Download Models
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/orin-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Orin
Please check if Orin is in MAX power mode:
`nvpmodel -q`
It should say:
```
NV Power Mode: MAXN
0
```

If it isn't, set it to MAX power mode with `sudo nvpmodel -m 0`. Then run:

```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/orin-benchmarks.csv \
--model_dir
```

# For Jetson Orin Nano
Please follow the setup and installation requirements above.

### Download Models
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/orin-nano-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Orin Nano
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/orin-nano-benchmarks.csv \
--model_dir --jetson_clocks
```

# For Jetson Orin NX
Please follow the setup and installation requirements above.

### Download Models for Orin NX 8GB
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/orin-nx-8gb-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Orin NX 8GB
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/orin-nx-8gb-benchmarks.csv \
--model_dir
```

### Download Models for Orin NX 16GB
```
python3 utils/download_models.py --all --csv_file_path /benchmark_csv/orin-nx-16gb-benchmarks.csv --save_dir
```

### Running All Benchmark Models at Once on Orin NX 16GB
```
sudo python3 benchmark.py --all --csv_file_path /benchmark_csv/orin-nx-16gb-benchmarks.csv \
--model_dir
```