https://github.com/machinelearningprodigy/volume_control_gesture
JSON representation
- Host: GitHub
- URL: https://github.com/machinelearningprodigy/volume_control_gesture
- Owner: machinelearningprodigy
- Created: 2024-07-17T14:15:54.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-16T10:05:11.000Z (12 months ago)
- Last Synced: 2025-02-16T11:18:30.199Z (12 months ago)
- Language: Python
- Size: 23.4 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Volume Control Gesture

A modern, interactive application that lets you control the system volume with simple hand gestures. Built with computer vision techniques in OpenCV and Python, it leverages hand tracking to create a seamless, intuitive volume-control experience: no more reaching for the volume buttons!
---
## Features

- **Real-Time Hand Tracking:** track hand movements in real time using OpenCV.
- **Gesture-Based Volume Control:** adjust the volume by moving your fingers closer together or farther apart.
- **Dynamic Volume Feedback:** displays the current volume level on the screen.
- **Easy Integration:** simple and lightweight, with minimal dependencies.
- **User-Friendly Interface:** clear and intuitive UI for easy operation.
---
## Tech Stack

- Python
- OpenCV
- MediaPipe (hand tracking)
- Pycaw (volume control)
---
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/machinelearningprodigy/Volume_Control_Gesture.git
   cd Volume_Control_Gesture
   ```

2. Install the dependencies:

   ```bash
   pip install opencv-python mediapipe pycaw numpy
   ```
---
## Usage

1. Run the application:

   ```bash
   python volume_control.py
   ```

2. How it works:
   - Show your hand in front of the camera.
   - Adjust the volume by moving your thumb and index finger:
     - Bring the fingers closer together → decrease volume.
     - Move the fingers apart → increase volume.
   - The current volume level is displayed in real time.
---
## Configuration

You can modify these settings in the code:

- **Camera Index:** adjust `cv2.VideoCapture(0)` to use an external camera.
- **Volume Range:** customize the minimum and maximum volume limits.
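As a rough sketch, the configurable values might be grouped into constants like these (the names and defaults are illustrative assumptions, not the repository's actual code):

```python
# Illustrative configuration constants (names and values are assumptions,
# not the repository's actual code).
CAMERA_INDEX = 0      # change to 1, 2, ... for an external webcam
MIN_DISTANCE = 30.0   # finger distance (px) mapped to the minimum volume
MAX_DISTANCE = 250.0  # finger distance (px) mapped to the maximum volume
MIN_VOLUME = 0.0      # volume scalar lower limit (0.0 = mute)
MAX_VOLUME = 1.0      # volume scalar upper limit (1.0 = full)

# The capture device would then be opened with:
#   cap = cv2.VideoCapture(CAMERA_INDEX)
```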
---
## Demo

https://github.com/machinelearningprodigy/Volume_Control_Gesture/assets/demo.mp4
---
## How It Works

1. **Hand Detection:** uses MediaPipe to detect and track hands in real time.
2. **Distance Measurement:** calculates the distance between the thumb and index fingertips.
3. **Volume Adjustment:** maps that distance to the system volume using Pycaw.
4. **Feedback Display:** shows the volume level dynamically on the video feed.
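Steps 2 and 3 above can be sketched with two small helpers. The function names, pixel thresholds, and the MediaPipe/Pycaw wiring shown in the comments are assumptions for illustration, not the repository's actual code:

```python
import math

def finger_distance(thumb, index):
    """Step 2: Euclidean distance between thumb and index fingertip (x, y) points."""
    return math.hypot(index[0] - thumb[0], index[1] - thumb[1])

def distance_to_volume(dist, d_min=30.0, d_max=250.0):
    """Step 3: linearly map finger distance to a 0.0-1.0 volume scalar, clamped."""
    t = (dist - d_min) / (d_max - d_min)
    return max(0.0, min(1.0, t))

# In the real loop (hardware-dependent, so only sketched here), the fingertip
# points would come from MediaPipe hand landmarks and the scalar would be sent
# to the speakers through Pycaw, e.g.:
#   thumb = hand_landmarks.landmark[4]   # THUMB_TIP
#   index = hand_landmarks.landmark[8]   # INDEX_FINGER_TIP
#   volume.SetMasterVolumeLevelScalar(distance_to_volume(d), None)

print(distance_to_volume(finger_distance((0, 0), (30, 40))))  # 50 px -> low volume
```

Clamping keeps the volume in range when the fingers move outside the calibrated distance window, so very small or very large hand spans simply pin the volume at mute or full.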
---
## Troubleshooting

- Ensure the camera is working properly.
- Run the script with administrator permissions if volume control doesn't work.
- Update the dependencies if hand tracking is inaccurate.
---
## Future Improvements

- **Enhanced Gesture Controls:** add play/pause functionality.
- **Voice Feedback:** provide audio cues for volume changes.
- **AI Integration:** improve hand tracking with AI-based enhancements.
---
## Contributing

Contributions are welcome!

1. Fork the repo.
2. Create a new branch.
3. Make your changes.
4. Submit a pull request.
---
## License

Licensed under the MIT License.
---
## Acknowledgments

Special thanks to the developers of OpenCV, MediaPipe, and Pycaw for their amazing libraries.
---
Let's make interaction with computers more intuitive!