https://github.com/nomi30701/sign-language-recognition-web-mediapie
Hand gesture recognition with MediaPipe in the browser. Supports WebGL (GPU) and Wasm (CPU), with webcam support for live detection. Includes Rock-Paper-Scissors and sign language gesture recognition.
browser browsers gh-pages github-pages mediapipe mediapipe-hands react sign-language-recognition wasm webapplication webgl
- Host: GitHub
- URL: https://github.com/nomi30701/sign-language-recognition-web-mediapie
- Owner: nomi30701
- License: mit
- Created: 2025-03-15T14:39:26.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2025-03-28T05:50:12.000Z (7 months ago)
- Last Synced: 2025-03-28T06:26:25.595Z (7 months ago)
- Topics: browser, browsers, gh-pages, github-pages, mediapipe, mediapipe-hands, react, sign-language-recognition, wasm, webapplication, webgl
- Language: JavaScript
- Homepage: https://nomi30701.github.io/hand-gesture-recognition-web/
- Size: 10.2 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# ✋ Sign Language Recognition with MediaPipe in the Browser
## 📝 Description
This browser-based hand gesture recognition application uses MediaPipe to detect and classify hand gestures in real time. No installation is required - everything runs directly in your web browser! The application includes three specialized models for different use cases:
1. General Gestures: MediaPipe's default gesture recognition model
2. Rock-Paper-Scissors: Custom model trained with MediaPipe Model Maker
3. Sign Language: Custom model trained with MediaPipe Model Maker for sign language detection
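
As a rough, illustrative sketch (not the project's exact code; the CDN URL and the model file path below are assumptions), loading one of these models in the browser with MediaPipe's Tasks Vision API could look like this:

```js
import { FilesetResolver, GestureRecognizer } from "@mediapipe/tasks-vision";

// Load the Wasm runtime files used by the vision tasks.
const vision = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
);

// Create a recognizer from one of the bundled models.
// "rock_paper_scissors.task" is a hypothetical path for a Model Maker export.
const recognizer = await GestureRecognizer.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath: "/models/rock_paper_scissors.task",
    delegate: "GPU", // WebGL acceleration; "CPU" selects the Wasm fallback
  },
  runningMode: "VIDEO",
  numHands: 2,
});
```

Switching between the three models is then a matter of creating the recognizer with a different `modelAssetPath`.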

## 📊 Data Sources

- Sign Language Model: Trained on the [Sign Language MNIST dataset](https://www.kaggle.com/datasets/datamunge/sign-language-mnist/data), which contains numerous labeled images of American Sign Language letters
- Rock-Paper-Scissors Model: Developed and trained following the [official MediaPipe tutorial](https://ai.google.dev/edge/mediapipe/solutions/customization/gesture_recognizer)

## 🤟 Sign Language Reference Chart
## ✨ Features
- 🖐️ Real-time hand gesture recognition
- 📷 Webcam support for live detection
- 🖼️ Image upload for static detection
- 🔄 Multiple specialized models to choose from:
  - ✌️ Rock-Paper-Scissors recognition
  - 👋 Sign language interpretation
  - 👆 General hand gesture detection
- 🚀 WebGL (GPU) acceleration for faster processing
- 💻 Wasm (CPU) support for wider device compatibility
- 📊 Detailed detection results with confidence scores
- 📱 Responsive design for mobile and desktop devices
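
For the live-webcam path, a minimal sketch of the detection loop (assuming the `recognizer` created in the earlier snippet and a `<video>` element already streaming the webcam) might look like this:

```js
// Run recognition on each webcam frame and log the top gesture with its confidence score.
const video = document.querySelector("video");

function detectLoop() {
  const result = recognizer.recognizeForVideo(video, performance.now());

  if (result.gestures.length > 0) {
    const top = result.gestures[0][0]; // best category for the first detected hand
    console.log(`${top.categoryName}: ${(top.score * 100).toFixed(1)}%`);
  }

  requestAnimationFrame(detectLoop);
}

requestAnimationFrame(detectLoop);
```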

## 🛠️ Technology Stack

- ⚛️ React.js - UI framework
- 📱 MediaPipe - Google's ML solution for vision tasks
- 🔧 MediaPipe Model Maker - For custom model training
- 🎨 TailwindCSS - Styling

## 🔧 Installation & Setup
```bash
# Clone the repository
git clone https://github.com/yourusername/hand-detection.git

# Navigate to project directory
cd hand-detection

# Install dependencies
npm install

# Start development server
npm run dev
```
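
For the image-upload path, static detection uses the same Tasks Vision API. A hedged sketch, assuming a recognizer created with `runningMode: "IMAGE"` and a hypothetical `#uploaded-image` element that already holds the uploaded picture:

```js
// Recognize gestures in a single uploaded image and print one result per detected hand.
const image = document.querySelector("#uploaded-image");
const result = recognizer.recognize(image);

for (const [i, categories] of result.gestures.entries()) {
  const top = categories[0]; // highest-confidence category for this hand
  console.log(`Hand ${i + 1}: ${top.categoryName} (${(top.score * 100).toFixed(1)}%)`);
}
```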