LUMS AI Hackathon
- Host: GitHub
- URL: https://github.com/alihassanml/lums-ai-hackthon
- Owner: alihassanml
- License: MIT
- Created: 2024-12-31T08:44:15.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-01-01T20:13:23.000Z (9 months ago)
- Last Synced: 2025-01-17T19:16:55.250Z (9 months ago)
- Topics: deep-learning, gru, lsm, magic, opencv, tensorflow
- Language: Jupyter Notebook
- Homepage:
- Size: 24.1 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LUMS AI Hackathon: Magic Wand Project 🪄
Welcome to the **Magic Wand Project**, developed as part of the **LUMS AI Hackathon**! 🚀 Our innovative solution bridges the gap between gesture recognition and real-time AI-powered interaction, delivering a unique and engaging user experience.
---
## 🎯 Project Overview
The Magic Wand project leverages **Streamlit** and **Groq API** to create an intuitive and interactive application for gesture-based controls. This AI-driven system can detect and classify hand gestures in real time, enabling seamless integration with various applications.
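As a rough illustration of the front end, a Streamlit app for such a system can be skeletal: capture a frame, hand it to the model, and display the result. The snippet below is a minimal sketch, not the project's actual `app.py`; the model call is left as a placeholder.

```python
# Hypothetical skeleton of the Streamlit front end (not the real app.py).
import streamlit as st

st.title("Magic Wand 🪄 — Gesture Recognition")

# st.camera_input returns a single snapshot as an uploaded file, or None.
snapshot = st.camera_input("Wave your wand at the camera")

if snapshot is not None:
    st.image(snapshot, caption="Captured frame")
    # Placeholder: decode the frame, buffer it into a 100-frame
    # sequence, and run the trained gesture model on it.
    st.write("Predicted gesture: (model output goes here)")
```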
### Key Highlights:
- Utilized **LSTMs** and **GRUs** for handling sequential data and long-term dependencies (see the model sketch after this list).
- Achieved **89% accuracy** on Kaggle, securing **3rd place** on the leaderboard.
- Designed with an easy-to-use interface powered by **Streamlit** for rapid deployment.
- Integrated the **Groq API** for high-performance model inference.
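For illustration, a minimal Keras model in this spirit might stack an LSTM and a GRU over the 100-frame sequences. The feature dimension and class count below are assumptions, not the project's actual values.

```python
from tensorflow.keras import layers, models

SEQ_LEN = 100     # frames per gesture sequence (from the README)
N_FEATURES = 64   # assumed: e.g. an 8x8 downsampled grayscale frame, flattened
N_CLASSES = 5     # assumed number of gesture classes

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, N_FEATURES)),
    # The LSTM captures long-term dependencies across the sequence;
    # the GRU adds a lighter-weight second recurrent pass.
    layers.LSTM(64, return_sequences=True),
    layers.GRU(32),
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

---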
## 🚀 Features
- **Real-Time Gesture Recognition**: Classify hand gestures with precision.
- **Streamlined UI**: User-friendly interface for interaction and visualization.
- **Robust Model**: Trained on 100-frame sequences using advanced deep learning techniques.
- **Scalable Design**: Built for adaptability across various use cases.

---
## 🛠️ Tech Stack
- **Frontend**: [Streamlit](https://streamlit.io/) for a responsive and interactive UI.
- **Backend**: Python with integration of the **Groq API** for high-performance model serving (see the call sketch after this list).
- **Deep Learning Framework**: TensorFlow/Keras for gesture recognition models.
- **Deployment**: Scalable architecture ready for production.
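For reference, a minimal call to the Groq chat-completions endpoint via the official `groq` Python client might look like the sketch below. It assumes a `GROQ_API_KEY` environment variable is set, and the model id is an assumption that may need updating.

```python
# Hypothetical sketch: turn a recognized gesture label into a narrated
# response via the Groq API. Requires `pip install groq` and GROQ_API_KEY.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

def describe_gesture(label: str) -> str:
    completion = client.chat.completions.create(
        model="llama-3.1-8b-instant",  # assumed model id
        messages=[
            {"role": "system", "content": "You narrate magic-wand gestures."},
            {"role": "user", "content": f"The user performed: {label}"},
        ],
    )
    return completion.choices[0].message.content

print(describe_gesture("circle"))
```

---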
## 📈 Model Performance
Our gesture recognition model achieved the following:
- **Accuracy**: 89% on test data.
- **Leaderboard**: Secured **3rd place** on Kaggle during the LUMS AI Hackathon.

---
## 🏗️ Installation
### Prerequisites
- Python 3.8 or higher
- Virtual environment (optional but recommended)

### Steps
1. **Clone the Repository**:
```bash
git clone https://github.com/alihassanml/LUMS-AI-Hackthon.git
cd LUMS-AI-Hackthon
```
2. **Install Dependencies**:
```bash
pip install -r requirements.txt
```
3. **Run the Application**:
```bash
streamlit run app.py
```
4. **Access the Application**:
- Open your browser and navigate to `http://localhost:8501`.

---
## 🧠 How It Works
1. **Data Collection**: Captures a sequence of 100 frames for each gesture.
2. **Model Training**: Utilizes LSTM and GRU networks to handle sequential data and classify gestures.
3. **Real-Time Prediction**: An inference pipeline powered by the Groq API ensures fast and accurate predictions (a minimal capture-and-predict sketch follows this list).
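The sketch below ties these steps together under stated assumptions: it captures a 100-frame sequence with OpenCV, reduces each frame to a crude 64-dimensional feature (the real project may use richer features such as hand landmarks), and runs a saved Keras model on the sequence. The model file name is hypothetical.

```python
# Hypothetical capture-and-predict loop; gesture_model.h5 is an assumed
# file name, and the 8x8 grayscale feature is a deliberate simplification.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

SEQ_LEN = 100
model = load_model("gesture_model.h5")

cap = cv2.VideoCapture(0)
frames = []
while len(frames) < SEQ_LEN:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Downsample to 8x8 and flatten -> 64 features per frame.
    frames.append(cv2.resize(gray, (8, 8)).flatten() / 255.0)
cap.release()

if len(frames) == SEQ_LEN:
    sequence = np.expand_dims(np.array(frames, dtype="float32"), axis=0)
    probs = model.predict(sequence)[0]
    print("Predicted gesture class:", int(np.argmax(probs)))
```

---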
## 🤝 Contributions
We welcome contributions! If you'd like to improve this project, feel free to:
1. Fork the repository.
2. Make your changes.
3. Submit a pull request.

---
## 📜 License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for more details.

---
## 🌟 Acknowledgments
Special thanks to the **LUMS AI Hackathon** organizers and my incredible team for their collaborative efforts.

For more details, contact [Ali Hassan](https://github.com/alihassanml).