{"id":22807943,"url":"https://github.com/prodev717/gesturecall","last_synced_at":"2026-04-08T18:02:17.001Z","repository":{"id":267151741,"uuid":"900398426","full_name":"prodev717/GestureCall","owner":"prodev717","description":"A video call intercom system designed for deaf people, using a Raspberry Pi, vibration motor, and AI-powered sign language translation. It enables communication between normal and deaf users over a local network without any call charges.","archived":false,"fork":false,"pushed_at":"2024-12-08T17:53:15.000Z","size":1728,"stargazers_count":1,"open_issues_count":0,"forks_count":1,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-09-13T16:45:07.846Z","etag":null,"topics":["ai","deaf-communications","fastapi","iot","local-network","mediapipe","opencv","python","python-gpio","raspberry-pi","sign-language-translation","socket-programming","vibration-sensor","video-call"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/prodev717.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-12-08T17:15:15.000Z","updated_at":"2025-01-03T02:02:41.000Z","dependencies_parsed_at":"2024-12-08T18:32:03.876Z","dependency_job_id":null,"html_url":"https://github.com/prodev717/GestureCall","commit_stats":null,"previous_names":["prodev717/gesturecall"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/prodev717/GestureCall","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/prodev717%2FGestureCall","tags_url":"
https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/prodev717%2FGestureCall/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/prodev717%2FGestureCall/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/prodev717%2FGestureCall/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/prodev717","download_url":"https://codeload.github.com/prodev717/GestureCall/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/prodev717%2FGestureCall/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31567227,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-08T14:31:17.711Z","status":"ssl_error","status_checked_at":"2026-04-08T14:31:17.202Z","response_time":54,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","deaf-communications","fastapi","iot","local-network","mediapipe","opencv","python","python-gpio","raspberry-pi","sign-language-translation","socket-programming","vibration-sensor","video-call"],"created_at":"2024-12-12T11:07:17.253Z","updated_at":"2026-04-08T18:02:16.982Z","avatar_url":"https://github.com/prodev717.png","language":"Python","readme":"# GestureCall\n\n## Video Call Intercom Based on IP System with Vibration Sensor\n\n## Project Overview\n\nThis 
project is an **integrated hardware-software solution** designed to facilitate communication between **deaf and hearing users**. It was developed as part of the **Engineering Clinics Course (ECS)** at **VIT-AP University** and addresses a **Smart India Hackathon (SIH)** problem statement.\n\nThe system enables **video calls** over a local network with **zero communication costs**, using a combination of hardware (Raspberry Pi) and Python-based software. It also features **sign language translation** and **speech-to-text functionality**, providing a seamless and inclusive communication platform.\n\n---\n\n## Features\n\n### 1. **Video Call Functionality**\n- Operates over a **local network (eth0/wlan)**.\n- Devices communicate using static IPs through Python's socket library.\n- Supports video calls between:\n  - Devices designed for deaf individuals.\n  - Regular desktops or other devices.\n\n### 2. **Sign Language Translation**\n- **Dataset and Mapping:**\n  - Static gestures corresponding to 24 commonly used words (mapped to important phrases).\n  - Each gesture is represented by angles between hand landmarks, captured using Mediapipe.\n  - All angles are saved in a **Pickle file** for efficient retrieval.\n- **Translation Process:**\n  - Mediapipe processes real-time hand landmarks.\n  - Angles are compared with the pre-trained dataset to identify gestures.\n  - The corresponding word is displayed on the user interface.\n\n### 3. **Speech-to-Text Conversion**\n- Converts spoken words into text for better understanding, displayed on the UI.\n\n### 4. **Hardware Integration**\n- **Raspberry Pi 4** with:\n  - XPT2046 5-inch touchscreen.\n  - Camera module for video capture.\n  - Vibration motor for incoming call alerts (triggered via GPIO).\n\n### 5. **UI Design**\n- Simple interface built with Tkinter.\n- Displays a list of available devices on the network, retrieved from the server.\n\n---\n\n## How It Works\n\n1. 
**Server Setup:**\n   - A **FastAPI server** provides the list of connected devices, their names, and static IPs.\n\n2. **Communication:**\n   - Devices communicate using **socket programming**, exchanging video frames and other data.\n\n3. **Gesture Recognition:**\n   - Mediapipe detects hand landmarks.\n   - Captured angles are compared to the **pre-trained dataset** stored in a Pickle file.\n   - Recognized gestures are mapped to specific words and displayed on the screen.\n\n4. **User Interaction:**\n   - Devices display the list of connected devices on the UI.\n   - Users select a device to initiate a video call.\n\n5. **Alerts:**\n   - Incoming calls trigger the **vibration motor**, notifying deaf users.\n\n---\n\n## Hardware Requirements\n\n- **Raspberry Pi 4**\n- **XPT2046 5-inch touchscreen**\n- **Camera module**\n- **Vibration motor**\n- **Local network setup (Ethernet or WiFi)**\n\n---\n\n## Software Stack\n\n- **Programming Language:** Python\n- **Libraries and Tools:**\n  - [FastAPI](https://fastapi.tiangolo.com/) (Server for managing devices in the network)\n  - [OpenCV](https://opencv.org/) (Camera access for desktop)\n  - [Mediapipe](https://mediapipe.dev/) (Hand gesture recognition)\n  - [Tkinter](https://docs.python.org/3/library/tkinter.html) (UI for user interaction)\n  - [Socket](https://docs.python.org/3/library/socket.html) (Local network communication)\n  - GPIO (Vibration motor control for alerts)\n\n---\n## Images\n\n![Demo](demo.png)\n\n---\n\n## Acknowledgments\n\nThis project was developed as part of the **Engineering Clinics Course (ECS)** at **VIT-AP University** and addresses a **Smart India Hackathon (SIH)** problem statement.\n\n---\n\n## License\n\nThis project is licensed under the MIT License. 
See the [LICENSE](LICENSE) file for details.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprodev717%2Fgesturecall","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fprodev717%2Fgesturecall","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprodev717%2Fgesturecall/lists"}