{"id":15020566,"url":"https://github.com/qengineering/qengineering","last_synced_at":"2026-03-10T15:32:36.322Z","repository":{"id":41809258,"uuid":"330366710","full_name":"Qengineering/Qengineering","owner":"Qengineering","description":"Machine vision apps","archived":false,"fork":false,"pushed_at":"2024-08-08T10:50:58.000Z","size":219,"stargazers_count":32,"open_issues_count":1,"forks_count":8,"subscribers_count":3,"default_branch":"main","last_synced_at":"2024-10-11T02:02:13.558Z","etag":null,"topics":["deep-learning","face-detection","jetson-nano","mnn","ncnn","paddle","raspberry-pi-4","tensorflow"],"latest_commit_sha":null,"homepage":"https://qengineering.eu/","language":null,"has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Qengineering.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-01-17T10:38:52.000Z","updated_at":"2024-08-08T10:51:01.000Z","dependencies_parsed_at":"2023-10-15T08:22:08.578Z","dependency_job_id":"4848f9a3-987e-4d52-9f36-649804627486","html_url":"https://github.com/Qengineering/Qengineering","commit_stats":{"total_commits":150,"total_committers":1,"mean_commits":150.0,"dds":0.0,"last_synced_commit":"14e009803ab6c77fc2da127fce90b5a46f8b31c3"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Qengineering%2FQengineering","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Qengineering%2FQengineering/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Qengineering%
2FQengineering/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Qengineering%2FQengineering/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Qengineering","download_url":"https://codeload.github.com/Qengineering/Qengineering/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":219864036,"owners_count":16555943,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","face-detection","jetson-nano","mnn","ncnn","paddle","raspberry-pi-4","tensorflow"],"created_at":"2024-09-24T19:55:16.458Z","updated_at":"2026-03-10T15:32:36.297Z","avatar_url":"https://github.com/Qengineering.png","language":null,"readme":"![output image]( https://qengineering.eu/github/GitHubLogo.jpg )\n\n---------\n\n## **[YoloCam](https://github.com/Qengineering/YoloCam)**\n![output image]( https://qengineering.eu/github/YoloCamGitHubSocialSmall.jpg ) ![output image]( https://qengineering.eu/github/YoloCamGitHubSocialSmallScreen.jpg )\u003cbr\u003e\nA Raspberry Pi 4, 3 or Zero 2, with stand-alone AI object recognition, browser-based live streaming, email, cloud storage, GPIO and URL event triggers.\u003cbr\u003e\n#### **[YoloIP](https://github.com/Qengineering/YoloIP)** \n![output image]( https://qengineering.eu/github/YoloIPGitHubSocialSmall.jpg ) ![output image]( https://qengineering.eu/github/YoloIPGitHubSocialSmallScreen.jpg )\u003cbr\u003e\nA Raspberry Pi 4 or 5, with stand-alone AI, supports multiple IP surveillance cameras.\u003cbr\u003e\n## 
**[Rock5GPT](https://rock5gpt.qengineering.eu)**\n\u003cimg width=\"543\" height=\"767\" alt=\"Rock5GPT\" src=\"https://github.com/user-attachments/assets/3ce5ad31-bc2b-4513-8ac9-42be793a86db\" /\u003e\u003cbr\u003e\nA professional Qwen3 AI-chatbot running on a Rock 5C.\n\n---------\n\n## Table of Contents\n\n- ### AI\n  * **VLM**\u003cbr\u003e\n    NPU RK3588 (Rock 5A, Orange Pi 5, Rock 5C)\n    + [Qwen3-**Video**](https://github.com/Qengineering/Qwen3-VL-NPU-VIDEO)\n    + [Qwen3-2B](https://github.com/Qengineering/Qwen3-VL-2B-NPU)\n    + [Qwen3-4B](https://github.com/Qengineering/Qwen3-VL-4B-NPU)\n    + [InternVL3.5-1B](https://github.com/Qengineering/InternVL3.5-1B-NPU)\n    + [InternVL3.5-2B](https://github.com/Qengineering/InternVL3.5-2B-NPU)\n    + [InternVL3.5-4B](https://github.com/Qengineering/InternVL3.5-4B-NPU)\n    + [InternVL3.5-8B](https://github.com/Qengineering/InternVL3.5-8B-NPU)\n    + [Qwen2.5-3B](https://github.com/Qengineering/Qwen2.5-VL-3B-NPU)\n    + [Qwen2-7B](https://github.com/Qengineering/Qwen2-VL-7B-NPU)\n    + [Qwen2-2.2B](https://github.com/Qengineering/Qwen2-VL-2B-NPU)\n    + [InternVL3-1B](https://github.com/Qengineering/InternVL3-NPU)\n    + [SmolVLM2-2.2B](https://github.com/Qengineering/SmolVLM2-2B-NPU)\n    + [SmolVLM2-500M](https://github.com/Qengineering/SmolVLM2-500M-NPU)\n    + [SmolVLM2-256M](https://github.com/Qengineering/SmolVLM2-256M-NPU)\n- ### Deep Learning\n  * **Classification**\n    + [TensorFlow Lite Raspberry Pi zero](https://github.com/Qengineering/TensorFlow_Lite_Classification_RPi_zero)\n    + [TensorFlow Lite Raspberry Pi 32](https://github.com/Qengineering/TensorFlow_Lite_Classification_RPi_32-bits)\n    + [TensorFlow Lite Raspberry Pi 64](https://github.com/Qengineering/TensorFlow_Lite_Classification_RPi_64-bits)\n    + [TensorFlow Lite Jetson Nano](https://github.com/Qengineering/TensorFlow_Lite_Classification_Jetson-Nano)\n    + [ncnn ShuffleNetV2 Raspberry 
Pi](https://github.com/Qengineering/ShuffleNetV2-ncnn)\n    + [ncnn SqueezeNet Raspberry Pi](https://github.com/Qengineering/SqueezeNet-ncnn)\n  * **SSD**\u003cbr\u003e\n      Raspberry Pi 4/5\n    + [TensorFlow Lite Raspberry Pi 32](https://github.com/Qengineering/TensorFlow_Lite_SSD_RPi_32-bits)\n    + [TensorFlow Lite Raspberry Pi 64](https://github.com/Qengineering/TensorFlow_Lite_SSD_RPi_64-bits)\n    + [OpenCV MobileNetV1_SSD Caffe Raspberry Pi 64](https://github.com/Qengineering/MobileNetV1_SSD_OpenCV_Caffe)\n    + [OpenCV MobileNetV1_SSD TensorFlow Raspberry Pi 64](https://github.com/Qengineering/MobileNet_SSD_OpenCV_TensorFlow)\n    + [ncnn Rfcn Raspberry Pi 64](https://github.com/Qengineering/Rfcn_ncnn)\n    + [ncnn Faster RCNN Raspberry Pi 64](https://github.com/Qengineering/Faster_RCNN_ncnn)\n    + [ncnn PeleeNet Raspberry Pi 64](https://github.com/Qengineering/PeleeNet_SSD)\n    + [ncnn NanoDet Raspberry Pi 64](https://github.com/Qengineering/NanoDet-ncnn-Raspberry-Pi-4)\n    + [ncnn NanoDet Plus Raspberry Pi 64](https://github.com/Qengineering/NanoDetPlus-ncnn-Raspberry-Pi-4)\n    + [ncnn PP-PicoDet Raspberry Pi 64](https://github.com/Qengineering/PP-PicoDet-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloFastestV2 Raspberry Pi 64](https://github.com/Qengineering/YoloFastestV2-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloX Raspberry Pi 64](https://github.com/Qengineering/YoloX-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV2 Raspberry Pi 64](https://github.com/Qengineering/YoloV2-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV3 Raspberry Pi 64](https://github.com/Qengineering/YoloV3-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV4 Raspberry Pi 64](https://github.com/Qengineering/YoloV4-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV5 Raspberry Pi 64](https://github.com/Qengineering/YoloV5-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV6 Raspberry Pi 64](https://github.com/Qengineering/YoloV6-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV7 Raspberry Pi 
64](https://github.com/Qengineering/YoloV7-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV8 Raspberry Pi 64](https://github.com/Qengineering/YoloV8-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV9 Raspberry Pi 64](https://github.com/Qengineering/YoloV9-ncnn-Raspberry-Pi-4)\n    + [ncnn YoloV10 Raspberry Pi 64](https://github.com/Qengineering/YoloV10-ncnn-Raspberry-Pi-4)\u003cbr\u003e\u003cbr\u003e\n      Jetson Nano\n    + [TensorFlow Lite Jetson Nano](https://github.com/Qengineering/TensorFlow_Lite_SSD_Jetson-Nano)\n    + [Darknet YoloV4 Jetson Nano](https://github.com/Qengineering/YoloV4-Darknet-Jetson-Nano)\n    + [ncnn NanoDet Jetson Nano](https://github.com/Qengineering/NanoDet-ncnn-Jetson-Nano)\n    + [ncnn NanoDet Plus Jetson Nano](https://github.com/Qengineering/NanoDetPlus-ncnn-Jetson-Nano)\n    + [ncnn PP-PicoDet Jetson Nano](https://github.com/Qengineering/PP-PicoDet-ncnn-Jeston-Nano)\n    + [ncnn YoloFastestV2 Jetson Nano](https://github.com/Qengineering/YoloFastest-ncnn-Jetson-Nano)\n    + [ncnn YoloX Jetson Nano](https://github.com/Qengineering/YoloX-ncnn-Jetson-Nano)\n    + [ncnn YoloV2 Jetson Nano](https://github.com/Qengineering/YoloV2-ncnn-Jetson-Nano)\n    + [ncnn YoloV3 Jetson Nano](https://github.com/Qengineering/YoloV3-ncnn-Jetson-Nano)\n    + [ncnn YoloV4 Jetson Nano](https://github.com/Qengineering/YoloV4-ncnn-Jetson-Nano)\n    + [ncnn YoloV5 Jetson Nano](https://github.com/Qengineering/YoloV5-ncnn-Jetson-Nano)\n    + [ncnn YoloV6 Jetson Nano](https://github.com/Qengineering/YoloV6-ncnn-Jetson-Nano)\n    + [ncnn YoloV7 Jetson Nano](https://github.com/Qengineering/YoloV7-ncnn-Jetson-Nano)\u003cbr\u003e\u003cbr\u003e\n      TensorRT\n    + [TensorRT YoloV8.2 Jetson (Orin) Nano](https://github.com/Qengineering/YoloV8-TensorRT-Jetson_Nano)\u003cbr\u003e\u003cbr\u003e\n      NPU RK3566/68/88 (Radxa Zero 3, Rock 5, Orange Pi 5, Rock 5C)\n    + [NPU PP YoloE](https://github.com/Qengineering/PPYoloE-NPU)\n    + [NPU 
YoloX](https://github.com/Qengineering/YoloX-NPU)\n    + [NPU YoloV5](https://github.com/Qengineering/YoloV5-NPU)\n    + [NPU YoloV5 multithread](https://github.com/Qengineering/YoloV5-NPU-Multithread)\n    + [NPU YoloV6](https://github.com/Qengineering/YoloV6-NPU)\n    + [NPU YoloV7](https://github.com/Qengineering/YoloV7-NPU)\n    + [NPU YoloV8](https://github.com/Qengineering/YoloV8-NPU)\n    + [NPU YoloV10](https://github.com/Qengineering/YoloV10-NPU)\n  * **Tracking**\n    + [ByteTrack with labels](https://github.com/Qengineering/ByteTrack_with_labels)\n    + [ncnn YoloX + Tracking Rpi 64](https://github.com/Qengineering/YoloX-Tracking-ncnn-RPi_64-bit)\n    + [ncnn NanoDet + Tracking Rpi 64](https://github.com/Qengineering/NanoDet-Tracking-ncnn-RPi_64-bit)\n    + [TensorFlow Lite + Tracking Rpi 64](https://github.com/Qengineering/TensorFlow_Lite-Tracking-RPi_64-bit)\n  * **Traffic**\n    + [Traffic counter camera Rpi 64](https://github.com/Qengineering/Traffic-Counter-RPi_64-bit) \n    + [Traffic counter camera Rock 5C](https://github.com/Qengineering/Traffic-Counter-Rock5C) \n  * **Segmentation**\n    + [TensorFlow Lite Raspberry Pi 32](https://github.com/Qengineering/TensorFlow_Lite_Segmentation_RPi_32-bit)\n    + [TensorFlow Lite Raspberry Pi 64](https://github.com/Qengineering/TensorFlow_Lite_Segmentation_RPi_64-bit)\n    + [TensorFlow Lite Jetson Nano](https://github.com/Qengineering/TensorFlow_Lite_Segmentation_Jetson-Nano)\n    + [ncnn YoloV5 segmentation Rpi 64](https://github.com/Qengineering/YoloV5-segmentation-ncnn-RPi4)\n    + [ncnn Yolact Raspberry Pi](https://github.com/Qengineering/Yolact-ncnn-Raspberry-Pi-4)\n    + [NPU YoloV5-seg](https://github.com/Qengineering/YoloV5-seg-NPU)    \n    + [NPU YoloV8-seg](https://github.com/Qengineering/YoloV8-seg-NPU)    \n  * **Pose**\n    + [TensorFlow Lite Raspberry Pi 32](https://github.com/Qengineering/TensorFlow_Lite_Pose_RPi_32-bits)\n    + [TensorFlow Lite Raspberry Pi 
64](https://github.com/Qengineering/TensorFlow_Lite_Pose_RPi_64-bits)\n    + [TensorFlow Lite Jetson Nano](https://github.com/Qengineering/TensorFlow_Lite_Pose_Jetson-Nano)\n    + [ncnn Raspberry Pi 64](https://github.com/Qengineering/ncnn_Pose_RPi_64-bits)   \n  * **Face detection**\n    + [MNN Ultra Raspberry Pi 64](https://github.com/Qengineering/Face-detection-Raspberry-Pi-32-64-bits/tree/master/MNN)\n    + [ncnn Ultra Raspberry Pi 64](https://github.com/Qengineering/Face-detection-Raspberry-Pi-32-64-bits/tree/master/ncnn)\n    + [OpenCV Ultra Raspberry Pi 64](https://github.com/Qengineering/Face-detection-Raspberry-Pi-32-64-bits/tree/master/OpenCV)\n    + [ncnn LFFD Raspberry Pi 64](https://github.com/Qengineering/LFFD-ncnn-Raspberry-Pi-4)\n    + [MNN LFFD Raspberry Pi 64](https://github.com/Qengineering/LFFD-MNN-Raspberry-Pi-4)\n    + [ncnn CenterFace Raspberry Pi 64](https://github.com/Qengineering/CenterFace-ncnn-Raspberry-Pi-4)\n    + [ncnn LFFD Jetson Nano](https://github.com/Qengineering/LFFD-ncnn-Jetson-Nano)\n    + [MNN LFFD Jetson Nano](https://github.com/Qengineering/LFFD-MNN-Jetson-Nano)\n    + [ncnn CenterFace Jetson Nano](https://github.com/Qengineering/CenterFace-ncnn-Jetson-Nano)\n    + [ncnn YoloV5 face Raspberry Pi 64](https://github.com/Qengineering/YoloV5-face-ncnn-RPi4)\n    * Face detection with landmarks\n      + [ncnn Ultra Raspberry Pi](https://github.com/Qengineering/Face-detection-Landmark-Raspberry-Pi-32-64-bits)\n  * **Face mask detection**\n    + [ncnn + Paddle Raspberry Pi](https://github.com/Qengineering/Face-Mask-Detection-Raspberry-Pi-64-bits)\n    + [ncnn + Paddle Jetson Nano](https://github.com/Qengineering/Face-Mask-Detection-Jetson-Nano)\n    + [TensorFlow Raspberry Pi](https://github.com/Qengineering/TensorFlow_Lite_Face_Mask_RPi_64-bits)\n    + [TensorFlow Jetson Nano](https://github.com/Qengineering/TensorFlow_Lite_Face_Mask_Jetson-Nano)\n  * **Face recognition**\n    + [ncnn Raspberry 
Pi](https://github.com/Qengineering/Face-Recognition-Raspberry-Pi-64-bits)\n    + [ncnn Jetson Nano](https://github.com/Qengineering/Face-Recognition-Jetson-Nano)\n    * Face recognition with mask\n      + [ncnn Jetson Nano](https://github.com/Qengineering/Face-Recognition-with-Mask-Jetson-Nano)\n  * **OCR**\n    + [PaddleOCR-Lite License plate RPi](https://github.com/Qengineering/PaddleOCR-Lite-License)\n    + [PaddleOCR-Lite Document scanner RPi](https://github.com/Qengineering/PaddleOCR-Lite-Document)\n    + [OpenCV detect text in image RPi](https://github.com/Qengineering/OpenCV_OCR_Detect_Text)\n    + [OpenCV recognize text with deep learning RPi](https://github.com/Qengineering/OpenCV_OCR_DNN)\n    + [OpenCV recognize text with tesseract RPi](https://github.com/Qengineering/OpenCV_OCR_Tesseract)\n  * **Parking** \n    + [XACTAI - OCR - License plate](https://github.com/xactai/ALPR_1.5_Public/tree/master/ALPR_1.5)\n  * **Super-resolution**\n    + [real ESRGAN ncnn Raspberry Pi 4 ](https://github.com/Qengineering/Real-ESRGAN-ncnn-Raspberry-Pi-4)\n    + [realsr ncnn Jetson Nano](https://github.com/Qengineering/realsr-ncnn-Jetson-Nano)\n  * **Face reconstruction**\n    + [GFPGAN ncnn Raspberry Pi 4 ](https://github.com/Qengineering/GFPGAN-ncnn-Raspberry-Pi-4)\n  * **Age Gender estimation**\n    + [OpenCV Raspberry Pi](https://github.com/Qengineering/Age-Gender-OpenCV-Raspberry-Pi-4)\n  * **Head pose estimation**\n    + [ncnn Ultra Raspberry Pi 64](https://github.com/Qengineering/Head-Pose-ncnn-Raspberry-Pi-4)\n  * **Hand pose estimation**\n    + [ncnn NanoDet Raspberry Pi 64](https://github.com/Qengineering/Hand-Pose-ncnn-Raspberry-Pi-4)\n  * **Colorization**\n    + [ncnn Colorization Raspberry Pi 64](https://github.com/Qengineering/ncnn-Colorization_Raspberry-Pi-4)\n  * **QR and bar code**\n    + [ZBar Raspberry Pi](https://github.com/Qengineering/QR_scanner_Raspberry_Pi)\n- ### Wheels\n  * **TensorFlow**\n    + [Raspberry Pi 
32](https://github.com/Qengineering/TensorFlow-Raspberry-Pi)\n    + [Raspberry Pi 64](https://github.com/Qengineering/TensorFlow-Raspberry-Pi_64-bit)\n    + [Jetson Nano](https://github.com/Qengineering/TensorFlow-JetsonNano)\n  * **TensorFlow Lite**\n    + [Raspberry Pi 64](https://github.com/Qengineering/TensorFlow-Lite-Raspberry-Pi_64-bit)\n    + [Jetson Nano](https://github.com/Qengineering/TensorFlow-Lite-Raspberry-Pi_64-bit)\n  * **TensorFlow Addons**\n    + [Raspberry Pi 64](https://github.com/Qengineering/TensorFlow-Addons-Raspberry-Pi_64-bit)\n    + [Jetson Nano](https://github.com/Qengineering/TensorFlow-Addons-Jetson-Nano)\n  * **PyTorch**\n    + [Raspberry Pi 64](https://github.com/Qengineering/PyTorch-Raspberry-Pi-64-OS)\n    + [Jetson Nano](https://github.com/Qengineering/PyTorch-Jetson-Nano)\n  * **PaddlePaddle**\n    + [Raspberry Pi 64](https://github.com/Qengineering/Paddle-Raspberry-Pi)\n    + [Jetson Nano](https://github.com/Qengineering/Paddle-Jetson-Nano)\n  * **TensorRT**\n    + [Jetson Nano Ubuntu 20.04](https://github.com/Qengineering/Jetson-Nano-Ubuntu-20-image)\n  * **OpenCV**\n    + [Raspberry Pi 32](https://github.com/Qengineering/Install-OpenCV-Raspberry-Pi-32-bits)\n    + [Raspberry Pi 64](https://github.com/Qengineering/Install-OpenCV-Raspberry-Pi-64-bits)  \n    + [Jetson Nano](https://github.com/Qengineering/Install-OpenCV-Jetson-Nano)\n- ### Miscellaneous\n  * **rtop**\n    + [rtop Ubuntu](https://github.com/Qengineering/rtop-Ubuntu)\n    + [rtop KDE](https://github.com/Qengineering/rtop-KDE) \n  * OpenCV\n    + [libcamera C++ API wrapper Bullseye 64](https://github.com/Qengineering/LCCV)\n    * GStreamer\n      * Buster\n        + [GStreamer 1.18.4 + OpenCV Raspberry Pi 32](https://github.com/Qengineering/GStreamer-1.18.4-RPi_32-bits)\n        + [GStreamer 1.18.4 + OpenCV Raspberry Pi 64](https://github.com/Qengineering/GStreamer-1.18.4-RPi_64-bits)\n      * Bullseye \n        + [Libcamera + OpenCV on Raspberry Pi 
32](https://github.com/Qengineering/Libcamera-OpenCV-RPi-Bullseye-32OS)\n        + [Libcamera + OpenCV on Raspberry Pi 64](https://github.com/Qengineering/Libcamera-OpenCV-RPi-Bullseye-64OS)\n    + [Multithread cameras with OpenCV](https://github.com/Qengineering/Multithread-Camera-OpenCV)\n    + [RTSP with OpenCV](https://github.com/Qengineering/RTSP-with-OpenCV)\n    + [Examples Raspberry Pi 64](https://github.com/Qengineering/OpenCV-Livecam-Raspberry-Pi)\n    + [Qt5 RPi 64 + Jetson Nano](https://github.com/Qengineering/Qt5-OpenCV-Raspberry-Pi-Jetson-Nano)\n    + [Blur detection](https://github.com/Qengineering/Blur-detection-with-FFT-in-C)\n    + [Fast background subtraction](https://github.com/Qengineering/Fast-Background-Substraction)\n  * Sensors\n    + [DHT22 sensor](https://github.com/Qengineering/DHT22-Raspberry-Pi)\n  * Caffe\n    + [cuDNN 8.0 + CUDA 10.3 Wei Liu SSD Fork + Configuration files RPi, Jetson](https://github.com/Qengineering/caffe)\n  * NPU\n    + [RKNN model zoo](https://github.com/Qengineering/rknn_model_zoo)\n- ### Images\n![output image]( https://qengineering.eu/github/SDcard32GB_small.jpg ) [Raspberry Pi 4 **Bullseye** 64-bit OS with several frameworks and deep-learning examples](https://github.com/Qengineering/RPi-Bullseye-DNN-image)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcard16GB_small.jpg ) [Raspberry Pi 4 Buster 64-bit OS with several frameworks and deep-learning examples](https://github.com/Qengineering/RPi-image)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcard16GBZero2small.jpg ) [Raspberry Pi **Zero 2 W 64-bit** OS image with OpenCV, TensorFlow Lite and ncnn](https://github.com/Qengineering/RPi_64-bit_Zero-2-image)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcard16GB_banana.jpg ) [Banana Pi M2 Zero image **with OV5640** camera and 
OpenCV](https://github.com/Qengineering/BananaPi-M2-Zero-OV5640)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcard32GB_RockPi5.jpg ) [Rock 5 with OpenCV, TNN, ncnn and **NPU**](https://github.com/Qengineering/Rock-5-image)\u003cbr\u003e\u003cbr\u003e\n![output image]( https://qengineering.eu/github/RockPi5_Ubuntu_22.jpg ) [Rock 5 with **Ubuntu 22.04**, OpenCV, ncnn and **NPU**](https://github.com/Qengineering/Rock-5-Ubuntu-22-image)\u003cbr\u003e\u003cbr\u003e\n![output image]( https://qengineering.eu/github/RadxaZero3_Ubuntu_22.jpg ) [Radxa Zero 3 with **Ubuntu 24.04**, OpenCV, ncnn and **NPU**](https://github.com/Qengineering/Radxa-Zero-3-NPU-Ubuntu24)\u003cbr\u003e\u003cbr\u003e\n![output image]( https://qengineering.eu/github/SDcard32GB_smallJetson.jpg ) [A Jetson Nano image with OpenCV, TensorFlow and PyTorch](https://github.com/Qengineering/Jetson-Nano-image)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcard32GBJetsonUB20small.jpg ) [A Jetson Nano - **Ubuntu 20.04** image with OpenCV, TensorFlow and PyTorch](https://github.com/Qengineering/Jetson-Nano-Ubuntu-20-image)\u003cbr/\u003e\u003cbr/\u003e\n- ### Applications\n![output image]( https://qengineering.eu/github/SDcardMotion.jpg ) [RPi z2, 3 or 4 motion surveillance camera with email notification and gdrive storage](https://github.com/Qengineering/RPiMotionCam)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcardDNN.jpg ) [**YoloCam**, the cheapest AI-powered camera with email notification, gdrive storage and GPIO output](https://github.com/Qengineering/YoloCam)\u003cbr/\u003e\u003cbr/\u003e\n![output image]( https://qengineering.eu/github/SDcardYoloIP.jpg ) [**YoloIP**, the cheapest AI-powered machine, supports multiple IP surveillance 
cameras](https://github.com/Qengineering/YoloIP)\u003cbr/\u003e\u003cbr/\u003e\n\n![statistics](https://github-readme-stats-auj21bfum-qengineerings-projects.vercel.app/api?username=Qengineering\u0026count_private=true\u0026show_icon=true\u0026card_width=400\u0026bg_color=00000000\u0026title_color=005E2C\u0026text_color=949CA5\u0026show_icons=true\u0026hide_border=true\u0026icon_color=00BE33)\u003cbr\u003e\n\n\u003c!--\n![stats](https://github-readme-stats-r4ppv5gun-qengineerings-projects.vercel.app/api?username=Qengineering\u0026count_private=true\u0026show_icon=true\u0026card_width=400\u0026bg_color=00000000\u0026title_color=005E2C\u0026text_color=949CA5\u0026show_icons=true\u0026hide_border=true\u0026icon_color=00BE33\u0026cachebust=20260107)\u003cbr\u003e\n--\u003e\n\u003c!--\n![Stats](https://github-readme-stats.vercel.app/api?username=Qengineering\u0026count_private=true\u0026show_icon=true\u0026card_width=400\u0026bg_color=00000000\u0026title_color=005E2C\u0026text_color=949CA5\u0026show_icons=true\u0026hide_border=true\u0026icon_color=00BE33)\n--\u003e\n\n\u003c!--\n\u003cp align=\"left\"\u003e \n  Visitor count\u003cbr\u003e\n  \u003cimg src=\"https://profile-counter.glitch.me/Qengineering/count.svg\" /\u003e\n\u003c/p\u003e\n\u003cbr\u003e\n--\u003e\n[![paypal](https://qengineering.eu/github/TipJarSmall4.png)](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick\u0026hosted_button_id=CPZTM5BB3FCYL) \n\n\u003c!--\n**Qengineering/Qengineering** is a ✨ _special_ ✨ repository because its `README.md` (this file) appears on your GitHub profile.\n\nHere are some ideas to get you started:\n\n- 🔭 I’m currently working on ...\n- 🌱 I’m currently learning ...\n- 👯 I’m looking to collaborate on ...\n- 🤔 I’m looking for help with ...\n- 💬 Ask me about ...\n- 📫 How to reach me: ...\n- 😄 Pronouns: ...\n- ⚡ Fun fact: 
...\n--\u003e\n","funding_links":["https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick\u0026hosted_button_id=CPZTM5BB3FCYL"],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fqengineering%2Fqengineering","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fqengineering%2Fqengineering","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fqengineering%2Fqengineering/lists"}