{"id":13438575,"url":"https://github.com/iwatake2222/InferenceHelper_Sample","last_synced_at":"2025-03-20T06:30:45.716Z","repository":{"id":42898173,"uuid":"324740159","full_name":"iwatake2222/InferenceHelper_Sample","owner":"iwatake2222","description":"Sample projects for InferenceHelper, a Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, ncnn, MNN, SNPE, Arm NN, NNabla, ONNX Runtime, LibTorch, TensorFlow","archived":false,"fork":false,"pushed_at":"2022-03-27T07:33:09.000Z","size":1384,"stargazers_count":20,"open_issues_count":0,"forks_count":5,"subscribers_count":4,"default_branch":"master","last_synced_at":"2024-10-28T00:23:14.331Z","etag":null,"topics":["cpp","deeplearning","mnn","ncnn","opencv","tensorflow","tensorrt"],"latest_commit_sha":null,"homepage":"","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/iwatake2222.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-12-27T10:47:54.000Z","updated_at":"2024-05-30T01:30:59.000Z","dependencies_parsed_at":"2022-09-24T07:01:43.312Z","dependency_job_id":null,"html_url":"https://github.com/iwatake2222/InferenceHelper_Sample","commit_stats":null,"previous_names":[],"tags_count":5,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/iwatake2222%2FInferenceHelper_Sample","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/iwatake2222%2FInferenceHelper_Sample/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/iwatake2222%2FInferenceHelper_Sample/releases","manifests_url":"https://repos.ecosyste.ms/api/
v1/hosts/GitHub/repositories/iwatake2222%2FInferenceHelper_Sample/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/iwatake2222","download_url":"https://codeload.github.com/iwatake2222/InferenceHelper_Sample/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":244564965,"owners_count":20473166,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cpp","deeplearning","mnn","ncnn","opencv","tensorflow","tensorrt"],"created_at":"2024-07-31T03:01:06.568Z","updated_at":"2025-03-20T06:30:40.707Z","avatar_url":"https://github.com/iwatake2222.png","language":"C++","readme":"# InferenceHelper_Sample\r\n- Sample project for Inference Helper (https://github.com/iwatake2222/InferenceHelper )\r\n- Run a simple classification model (MobileNetv2) using several deep learning frameworks\r\n\r\n[![CI Windows](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_windows.yml/badge.svg)](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_windows.yml)\r\n[![CI Ubuntu](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_ubuntu.yml/badge.svg)](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_ubuntu.yml)\r\n[![CI Arm](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_arm.yml/badge.svg)](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_arm.yml)\r\n[![CI 
Android](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_android.yml/badge.svg)](https://github.com/iwatake2222/InferenceHelper_Sample/actions/workflows/ci_android.yml)\r\n\r\n![Class Diagram](00_doc/class_diagram.png) \r\n\r\n## Usage\r\n```\r\n./main [input]\r\n\r\n - option description: [input]\r\n    - blank\r\n        - use the default image file set in the source code (main.cpp)\r\n        - e.g. `./main`\r\n    - *.mp4, *.avi, *.webm\r\n        - use a video file\r\n        - e.g. `./main test.mp4`\r\n    - *.jpg, *.png, *.bmp\r\n        - use an image file\r\n        - e.g. `./main test.jpg`\r\n    - number (e.g. 0, 1, 2, ...)\r\n        - use a camera\r\n        - e.g. `./main 0`\r\n```\r\n\r\n## How to build a sample project\r\n### 0. Requirements\r\n- OpenCV 4.x\r\n\r\n### 1. Download\r\n- Get source code\r\n    ```sh\r\n    git clone https://github.com/iwatake2222/InferenceHelper_Sample\r\n    cd InferenceHelper_Sample\r\n    git submodule update --init\r\n    sh InferenceHelper/third_party/download_prebuilt_libraries.sh\r\n    ```\r\n    - If you have a problem, please refer to https://github.com/iwatake2222/InferenceHelper#installation\r\n    - If your host PC is Windows but you want to build/run on Linux (like WSL2), it's better to run this script on the target OS (Linux). Otherwise, the symbolic links will be broken.\r\n- Download models\r\n    ```sh\r\n    sh ./download_resource.sh\r\n    ```\r\n\r\n### 2-a. Build in Linux (PC Ubuntu, Raspberry Pi, Jetson Nano, etc.)\r\n```sh\r\ncd pj_cls_mobilenet_v2\r\nmkdir -p build \u0026\u0026 cd build\r\ncmake .. -DINFERENCE_HELPER_ENABLE_MNN=on\r\nmake\r\n./main\r\n```\r\n\r\n### 2-b. 
Build in Windows (Visual Studio)\r\n- Configure and Generate a new project using cmake-gui for Visual Studio 2019 64-bit\r\n    - `Where is the source code` : path-to-InferenceHelper_Sample/pj_cls_mobilenet_v2\r\n    - `Where to build the binaries` : path-to-build\t(any)\r\n    - Check one of the listed InferenceHelper frameworks (e.g. `INFERENCE_HELPER_ENABLE_MNN` )\r\n- Open `main.sln`\r\n- Set `main` project as a startup project, then build and run!\r\n\r\n### 2-c. Build in Linux (Cross compile for armv7 and aarch64)\r\n```\r\nsudo apt install g++-arm-linux-gnueabi g++-arm-linux-gnueabihf g++-aarch64-linux-gnu\r\n\r\nexport CC=aarch64-linux-gnu-gcc\r\nexport CXX=aarch64-linux-gnu-g++\r\ncmake .. -DBUILD_SYSTEM=aarch64 -DINFERENCE_HELPER_ENABLE_MNN=on\r\n\r\nexport CC=arm-linux-gnueabi-gcc\r\nexport CXX=arm-linux-gnueabi-g++\r\ncmake .. -DBUILD_SYSTEM=armv7 -DINFERENCE_HELPER_ENABLE_MNN=on\r\n```\r\n\r\nYou need to link the appropriate OpenCV build for the target architecture.\r\n\r\n### 2-d. Build in Android Studio\r\n- Requirements\r\n    - Android Studio\r\n        - Compile Sdk Version\r\n            - 30\r\n        - Build Tools version\r\n            - 30.0.0\r\n        - Target SDK Version\r\n            - 30\r\n        - Min SDK Version\r\n            - 24\r\n            - With 23, I got the following error\r\n                - `bionic/libc/include/bits/fortify/unistd.h:174: undefined reference to `__write_chk'`\r\n                - https://github.com/android/ndk/issues/1179\r\n    - Android NDK\r\n        - 23.1.7779620\r\n    - OpenCV\r\n        - opencv-4.3.0-android-sdk.zip\r\n    - *These are just the versions I used\r\n\r\n- Configure NDK\r\n    - File -\u003e Project Structure -\u003e SDK Location -\u003e Android NDK location (before Android Studio 4.0)\r\n        - C:\\Users\\abc\\AppData\\Local\\Android\\Sdk\\ndk\\21.3.6528147\r\n    - Modify `local.properties` to specify `sdk.dir` and `ndk.dir` (after Android Studio 4.1)\r\n        ```\r\n        
sdk.dir=C\\:\\\\Users\\\\xxx\\\\AppData\\\\Local\\\\Android\\\\Sdk\r\n        ndk.dir=C\\:\\\\Users\\\\xxx\\\\AppData\\\\Local\\\\Android\\\\sdk\\\\ndk\\\\23.1.7779620\r\n        ```\r\n\r\n- Import OpenCV\r\n    - Download and extract OpenCV android-sdk (https://github.com/opencv/opencv/releases )\r\n    - File -\u003e New -\u003e Import Module\r\n        - path-to-opencv\\opencv-4.3.0-android-sdk\\OpenCV-android-sdk\\sdk\r\n    - File -\u003e Project Structure -\u003e Dependencies -\u003e app -\u003e Declared Dependencies -\u003e + -\u003e Module Dependencies\r\n        - select sdk\r\n    - In case you cannot import the OpenCV module, remove the sdk module and the dependency of app on sdk in Project Structure\r\n- Note: To avoid saving modified settings, use the following command\r\n    - `git update-index --skip-worktree ViewAndroid/app/build.gradle ViewAndroid/settings.gradle ViewAndroid/.idea/gradle.xml`\r\n- Copy `resource` directory to `/storage/emulated/0/Android/data/com.iwatake.viewandroidinferencehelpersample/files/Documents/resource`\r\n    - The directory is created after running the app, so the first run is expected to fail because the model files cannot be read\r\n- Modify `ViewAndroid\\app\\src\\main\\cpp\\CMakeLists.txt` to select an image processor you want to use\r\n    - `set(ImageProcessor_DIR \"${CMAKE_CURRENT_LIST_DIR}/../../../../../pj_cls_mobilenet_v2/image_processor\")`\r\n    - replace `pj_cls_mobilenet_v2` with another project\r\n\r\n## Note\r\n### Options: Select Deep Learning framework\r\n- Choose one of the following options.\r\n    - *Note* : InferenceHelper itself supports multiple frameworks (i.e. you can set `on` for several frameworks). However, in this sample project the selected framework is also used to `create` the InferenceHelper instance for simplicity.\r\n    - *Note* : When you change an option, it's safer to clean the project before you re-run cmake\r\n\r\n```sh\r\ncmake .. -DINFERENCE_HELPER_ENABLE_OPENCV=on\r\ncmake .. 
-DINFERENCE_HELPER_ENABLE_TFLITE=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_XNNPACK=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_GPU=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_EDGETPU=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_NNAPI=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TENSORRT=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_NCNN=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_MNN=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_SNPE=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_ARMNN=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_NNABLA=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_NNABLA_CUDA=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_ONNX_RUNTIME=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_ONNX_RUNTIME_CUDA=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_LIBTORCH=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_LIBTORCH_CUDA=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TENSORFLOW=on\r\ncmake .. -DINFERENCE_HELPER_ENABLE_TENSORFLOW_GPU=on\r\n```\r\n\r\n### Note: TensorFlow Lite + EdgeTPU\r\n- You may need something like the following commands to run the app\r\n    ```sh\r\n    cp libedgetpu.so.1.0 libedgetpu.so.1\r\n    sudo LD_LIBRARY_PATH=./ ./main\r\n    ```\r\n\r\n### Note: ncnn\r\n- Build for Android\r\n    - In case you encounter `error: use of typeid requires -frtti`, modify `ViewAndroid\\sdk\\native\\jni\\include\\opencv2\\opencv_modules.hpp`\r\n        - `//#define HAVE_OPENCV_FLANN`\r\n\r\n# License\r\n- InferenceHelper_Sample\r\n- https://github.com/iwatake2222/InferenceHelper_Sample\r\n- Copyright 2020 iwatake2222\r\n- Licensed under the Apache License, Version 2.0\r\n\r\n# Acknowledgements\r\n- This project utilizes OSS (Open Source Software)\r\n    - [NOTICE.md](NOTICE.md)\r\n","funding_links":[],"categories":["C++","🛠️ Tools \u0026 Utilities"],"sub_categories":["🔗 Inference 
Helpers"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fiwatake2222%2FInferenceHelper_Sample","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fiwatake2222%2FInferenceHelper_Sample","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fiwatake2222%2FInferenceHelper_Sample/lists"}