Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/hanleyweng/gesture-recognition-101-coreml-arkit
Simple project to recognize hands in realtime. Serves as an example for building your own object recognizer.
arkit augmented-reality computer-vision coreml gesture-recognition hand-tracking ios machine-learning object-recognition
- Host: GitHub
- URL: https://github.com/hanleyweng/gesture-recognition-101-coreml-arkit
- Owner: hanleyweng
- License: mit
- Created: 2017-10-22T18:15:34.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2017-12-11T00:18:34.000Z (about 7 years ago)
- Last Synced: 2025-01-08T16:18:53.433Z (13 days ago)
- Topics: arkit, augmented-reality, computer-vision, coreml, gesture-recognition, hand-tracking, ios, machine-learning, object-recognition
- Language: Swift
- Homepage: https://medium.com/p/7f8c09b461a1
- Size: 12.6 MB
- Stars: 343
- Watchers: 11
- Forks: 49
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
Awesome Lists containing this project
README
# Hand Gesture Recognition
This simple sample project recognizes hands in realtime. It serves as a basic example for recognizing your own objects. Suitable for AR. Written for the tutorial ["Create your own Object Recognizer"](https://medium.com/p/7f8c09b461a1).

![gif showing fist and spread hand appearing and disappearing from screen, and being recognized on an iPhone](post-media/giphy.gif)
[Demo Video - on YouTube](https://youtu.be/P3Q8awgT9Lk)
Tech: iOS 11, ARKit, CoreML, iPhone 7 Plus, Xcode 9.1, Swift 4.0
## Notes:
This demonstrates basic Object Recognition (for spread hand, fist, and no hands). It serves as a building block for object detection, localization, gesture recognition, and hand tracking.
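As a rough illustration of how per-frame classifier output can become a gesture-recognition building block, here is a minimal Swift sketch that turns Vision classifications into discrete gesture events. It is not code from this repo: the class name `GestureDebouncer`, the label strings, and the confidence threshold are placeholders for whatever classes your own model was trained on.

```swift
import Vision

/// Minimal sketch (not from this repo): turn per-frame CoreML/Vision
/// classifications into discrete gesture events by only reacting when a
/// sufficiently confident label differs from the last one seen.
/// The label names ("openHand", "fist", "noHands") are placeholders.
final class GestureDebouncer {
    private var currentLabel = "noHands"

    /// Called whenever the recognized gesture changes, e.g. to trigger an AR action.
    var onGestureChange: ((String) -> Void)?

    func handle(_ observations: [VNClassificationObservation], threshold: VNConfidence = 0.9) {
        // Vision returns observations sorted by confidence; take the best one.
        guard let best = observations.first, best.confidence >= threshold else { return }
        if best.identifier != currentLabel {
            currentLabel = best.identifier
            onGestureChange?(best.identifier)
        }
    }
}
```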
Disclaimer:
The sample model provided here was captured in 1 hour and is biased to one human hand. It's intended as a placeholder for your own models. (See [Tutorial](https://medium.com/p/7f8c09b461a1))
---
## Steps Taken (Overview)

_Here's an overview of the steps taken. (You can also view my commit history to see the steps involved.)_
1. Build an intuition by playing with Google's [Teachable Machine](https://teachablemachine.withgoogle.com/).
2. Build a dataset.
3. Create a Core ML model using Microsoft's [CustomVision.ai](https://www.customvision.ai/).
4. Run the model in realtime with ARKit (see the sketch after this list).

[Full Tutorial here](https://medium.com/p/7f8c09b461a1)
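For step 4, the sketch below shows what "run the model in realtime with ARKit" can look like using the common ARKit + Vision pattern, rather than this repo's exact code. It assumes Xcode has generated a model class from your exported `.mlmodel`; `HandsModel` and `statusLabel` are purely hypothetical names for illustration.

```swift
import ARKit
import UIKit
import Vision

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!    // AR camera view wired up in the storyboard
    @IBOutlet var statusLabel: UILabel!    // hypothetical label for showing results

    // Serial queue so only one Vision request is in flight at a time.
    private let visionQueue = DispatchQueue(label: "vision-queue")

    // "HandsModel" is a placeholder for whatever class Xcode generates
    // from your exported .mlmodel file.
    private lazy var classificationRequest: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: HandsModel().model)
        let request = VNCoreMLRequest(model: model) { [weak self] request, _ in
            guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
            DispatchQueue.main.async {
                // e.g. "fist (0.97)"
                self?.statusLabel.text = "\(best.identifier) (\(String(format: "%.2f", best.confidence)))"
            }
        }
        request.imageCropAndScaleOption = .centerCrop
        return request
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.session.run(ARWorldTrackingConfiguration())
        classifyCurrentFrame()
    }

    // Grab the most recent ARKit camera frame, classify it off the main
    // thread, then schedule the next pass: a simple realtime loop.
    private func classifyCurrentFrame() {
        guard let pixelBuffer = sceneView.session.currentFrame?.capturedImage else {
            // No frame available yet; try again shortly.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { self.classifyCurrentFrame() }
            return
        }
        visionQueue.async {
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
            try? handler.perform([self.classificationRequest])
            DispatchQueue.main.async { self.classifyCurrentFrame() }
        }
    }
}
```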
_P.S. A few well-selected images are sufficient for CustomVision.ai. For the sample model here, I did 3 rounds of data collection (adding 63, 38, and 21 images per round). Alternating classes during data collection also appeared to work better than gathering all the class images at once._
![image of dataset](post-media/image-gather-screen.png)
## License
MIT Open Source License. Use as you wish. Have fun!