πŸŽ›οΈ Volume Control Gesture 🎯

A modern and interactive application that allows users to control the system volume using simple hand gestures. πŸ–οΈπŸŽšοΈ Built using computer vision techniques with OpenCV and Python, this project leverages hand-tracking to create a seamless and intuitive volume control experienceβ€”no more reaching for the volume buttons! πŸš€

---

🌟 Features

🎥 Real-Time Hand Tracking: Track hand movements in real time using MediaPipe and OpenCV.

🖐️ Gesture-Based Volume Control: Adjust the volume by moving your hand closer or farther.

🔊 Dynamic Volume Feedback: Displays the current volume level on the screen.

⚙️ Easy Integration: Simple and lightweight with minimal dependencies.

🎛️ User-Friendly Interface: Clear and intuitive UI for easy operation.

---

πŸ› οΈ Tech Stack

🐍 Python

🎯 OpenCV

πŸ€– MediaPipe (for hand tracking)

πŸ”Š Pycaw (for volume control)

---

🚀 Installation

1. Clone the repository:

git clone https://github.com/machinelearningprodigy/Volume_Control_Gesture.git
cd Volume_Control_Gesture

2. Install dependencies:

pip install opencv-python mediapipe pycaw numpy
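
To confirm everything installed correctly, a quick sanity check like this (illustrative, not part of the repository) should import every dependency without errors:

```python
# Quick check that each dependency imports cleanly; illustrative only.
import cv2
import mediapipe as mp
import numpy as np
from pycaw.pycaw import AudioUtilities  # pycaw controls the Windows system volume

print("OpenCV:", cv2.__version__)
print("MediaPipe:", mp.__version__)
print("NumPy:", np.__version__)
```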

---

▶️ Usage

1. Run the application:

python volume_control.py

2. How it works:

Show your hand 🖐️ in front of the camera.

Adjust the volume by moving your thumb and index finger (the mapping is sketched after this list):

Bring the fingers closer → 🔉 Decrease volume.

Move the fingers apart → 🔊 Increase volume.

The current volume level is displayed in real time.
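
The closer/farther behaviour boils down to mapping the thumb-index distance onto a volume percentage. A minimal sketch of that mapping, assuming an illustrative working range of roughly 30-250 pixels (the real limits depend on your camera and how far your hand is from it):

```python
import numpy as np

# Hypothetical pixel distance between the thumb tip and the index fingertip.
finger_distance = 120

# Map an assumed 30-250 px working range onto 0-100 % volume.
volume_percent = np.interp(finger_distance, [30, 250], [0, 100])
print(f"Volume: {volume_percent:.0f}%")  # closer fingers -> lower volume
```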

---

βš™οΈ Configuration

You can modify the settings in the code:

Camera Index: Adjust cv2.VideoCapture(0) for external cameras.

Volume Range: Customize the volume limits.
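
A sketch of how those two settings could look once pulled out as constants; the names and values here are illustrative and may not match volume_control.py:

```python
import cv2

CAMERA_INDEX = 1                      # 0 = built-in webcam; try 1, 2, ... for an external camera
MIN_DISTANCE, MAX_DISTANCE = 30, 250  # assumed thumb-index distance range in pixels
MIN_VOLUME, MAX_VOLUME = 0.0, 1.0     # Pycaw master-volume scalar range (0 % to 100 %)

cap = cv2.VideoCapture(CAMERA_INDEX)
```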

---

📸 Demo

https://github.com/machinelearningprodigy/Volume_Control_Gesture/assets/demo.mp4

---

🧠 How It Works

1. Hand Detection: Uses MediaPipe to detect and track hands in real time.

2. Distance Measurement: Calculates the on-screen distance between the thumb tip and the index fingertip.

3. Volume Adjustment: Maps the distance to the system volume using Pycaw.

4. Feedback Display: Shows the volume level dynamically on the video feed (an end-to-end sketch follows below).
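
A minimal end-to-end sketch of this pipeline, assuming a Windows machine (Pycaw drives the Windows Core Audio API) and the illustrative 30-250 px distance range used above; the actual volume_control.py may differ in its details:

```python
import cv2
import numpy as np
import mediapipe as mp
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Set up the Windows master-volume endpoint via Pycaw.
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))

# MediaPipe Hands detector; one hand is enough for this gesture.
mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils
hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 1. Hand detection: MediaPipe expects RGB frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        hand = results.multi_hand_landmarks[0]
        h, w, _ = frame.shape
        # Landmark 4 = thumb tip, landmark 8 = index fingertip.
        x1, y1 = int(hand.landmark[4].x * w), int(hand.landmark[4].y * h)
        x2, y2 = int(hand.landmark[8].x * w), int(hand.landmark[8].y * h)

        # 2. Distance measurement between the two fingertips (pixels).
        distance = np.hypot(x2 - x1, y2 - y1)

        # 3. Volume adjustment: map the assumed 30-250 px range to 0.0-1.0.
        level = float(np.interp(distance, [30, 250], [0.0, 1.0]))
        volume.SetMasterVolumeLevelScalar(level, None)

        # 4. Feedback display: landmarks, pinch line, and volume percentage.
        mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, f"Volume: {int(level * 100)}%", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("Volume Control Gesture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Press q in the video window to quit the sketch.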

---

💡 Troubleshooting

Ensure the camera is working properly (a quick probe script is sketched below). 🎥

Run the script with administrator permissions if the volume control doesn't work.

Update the dependencies if hand tracking is inaccurate.
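
If the default camera won't open, a short probe like this (illustrative, not part of the repository) shows which camera indices actually deliver frames:

```python
import cv2

# Probe the first few camera indices and report which ones return a frame.
for index in range(3):
    cap = cv2.VideoCapture(index)
    ok, _ = cap.read()
    print(f"Camera index {index}: {'OK' if ok else 'not available'}")
    cap.release()
```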

---

🚀 Future Improvements

🤖 Enhanced Gesture Controls: Add play/pause functionality.

🔊 Voice Feedback: Provide audio cues for volume changes.

🧠 AI Integration: Improve hand tracking with AI-based enhancements.

---

🤝 Contributing

Contributions are welcome! 🌱

1. Fork the repo 🍴

2. Create a new branch 🌿

3. Make your changes 💡

4. Submit a pull request 🔔

---

πŸ›‘οΈ License

βš–οΈ Licensed under the MIT License.

---

🎯 Acknowledgments

Special thanks to the developers of OpenCV, MediaPipe, and Pycaw for their amazing libraries. 🙌

---

πŸ” Let's make interaction with computers more intuitive! πŸŒπŸš€