https://github.com/0xnaman1/american-sign-language-detection-using-computer-vision
The project translates American Sign Language into English. It uses computer vision and deep learning to predict ASL alphabet letters, forms sentences from those predictions, and uses text-to-speech to convert the resulting message into speech. The project was built at the Hack36 hackathon at MNNIT Allahabad.
computer-vision deeplearning keras machine-learning neural-network opencv tensorflow texttospeech
- Host: GitHub
- URL: https://github.com/0xnaman1/american-sign-language-detection-using-computer-vision
- Owner: 0xnaman1
- Created: 2020-02-14T18:45:14.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2022-12-08T04:07:16.000Z (over 2 years ago)
- Last Synced: 2025-02-28T10:48:18.898Z (4 months ago)
- Topics: computer-vision, deeplearning, keras, machine-learning, neural-network, opencv, tensorflow, texttospeech
- Language: Jupyter Notebook
- Homepage: https://www.youtube.com/watch?v=mIyWNsGfHAQ
- Size: 35 MB
- Stars: 19
- Watchers: 3
- Forks: 11
- Open Issues: 15
Metadata Files:
- Readme: README.md
README
# Hack36_Project_Team MISFITS
Submission for the Hack36 hackathon at MNNIT Allahabad
# American Sign Language Detection using Deep Neural Networks
American Sign Language (ASL) is a visual language. With signing, the brain processes linguistic information through the eyes. The shape, placement, and movement of the hands, as well as facial expressions and body movements, all play important parts in conveying information. It is the primary language of many North Americans who are deaf or hard of hearing, and it is used by many hearing people as well. The project can help people who are deaf or mute communicate easily with people who do not understand sign language.
## Formation of a message using fingerspelling in ASL
Fingerspelling is part of ASL and is used to spell out English words. In the fingerspelled alphabet, each letter corresponds to a distinct handshape. Fingerspelling is often used for proper names or to indicate the English word for something. We use **Convolutional Neural Networks** to predict each sign language letter and combine the predicted letters into the sentence to be conveyed.
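
As a rough illustration of what such a model could look like, the sketch below builds a small Keras CNN for the ASL alphabet together with a helper that appends predicted letters to a running message. The architecture, the 64×64 input size, and the 29-class label set (A–Z plus "space", "del", "nothing", matching the Kaggle dataset linked below) are illustrative assumptions, not the repository's actual code.

```python
# Sketch of a small Keras CNN for ASL alphabet classification plus a helper
# that turns a stream of predicted letters into a message. The architecture,
# the 64x64 input size and the 29-class label set are illustrative assumptions.
import string

from tensorflow.keras import layers, models

CLASS_NAMES = list(string.ascii_uppercase) + ["space", "del", "nothing"]

def build_model(num_classes=len(CLASS_NAMES), input_shape=(64, 64, 3)):
    """Plain stack of Conv/Pool blocks ending in a softmax over the signs."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def append_letter(message: str, letter: str) -> str:
    """Grow the message one fingerspelled sign at a time."""
    if letter == "space":
        return message + " "
    if letter == "del":          # "delete" sign removes the last character
        return message[:-1]
    if letter == "nothing":      # no hand in frame: leave the message alone
        return message
    return message + letter
```
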
The message is then converted from text to speech using a Python text-to-speech library.
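
The README does not name the library; as one possibility, an offline package such as pyttsx3 could speak the assembled message. The snippet below is a hypothetical example, not necessarily what the project uses.

```python
# Hypothetical text-to-speech step using pyttsx3 (an offline TTS package);
# the project may rely on a different library.
import pyttsx3

def speak(message: str) -> None:
    engine = pyttsx3.init()           # pick the platform's default speech driver
    engine.setProperty("rate", 150)   # speaking rate in words per minute
    engine.say(message)
    engine.runAndWait()               # block until the sentence has been spoken

speak("HELLO WORLD")
```
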
The input will be provided in real time using the webcam.
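
A minimal sketch of how such a real-time loop might look with OpenCV is shown below. The fixed region-of-interest box, the 64×64 preprocessing, the key bindings, and the `asl_cnn.h5` model path are illustrative assumptions rather than the project's actual values.

```python
# Illustrative real-time loop: grab webcam frames with OpenCV, crop a fixed
# region of interest, classify it, and grow the message on a key press.
# The ROI box, 64x64 resize, key bindings and "asl_cnn.h5" path are assumptions.
import string

import cv2
import numpy as np
from tensorflow.keras.models import load_model

CLASS_NAMES = list(string.ascii_uppercase) + ["space", "del", "nothing"]

model = load_model("asl_cnn.h5")       # hypothetical saved classifier
cap = cv2.VideoCapture(0)              # default webcam
message = ""

while True:
    ok, frame = cap.read()
    if not ok:
        break
    x0, y0, x1, y1 = 100, 100, 300, 300                          # box where the hand is shown
    roi = cv2.cvtColor(frame[y0:y1, x0:x1], cv2.COLOR_BGR2RGB)   # model expects RGB
    inp = cv2.resize(roi, (64, 64)).astype("float32") / 255.0
    probs = model.predict(inp[np.newaxis])[0]                    # add the batch dimension
    letter = CLASS_NAMES[int(np.argmax(probs))]

    # Overlay the ROI box and the current prediction on the live feed.
    cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)
    cv2.putText(frame, letter, (x0, y0 - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("ASL detection", frame)

    key = cv2.waitKey(1) & 0xFF
    if key == ord("a"):                # accept the current prediction
        if letter == "space":
            message += " "
        elif letter == "del":
            message = message[:-1]
        elif letter != "nothing":
            message += letter
    elif key == ord("q"):              # quit and print the assembled message
        break

cap.release()
cv2.destroyAllWindows()
print("Message:", message)
```
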
## Data Source
https://www.kaggle.com/grassknoted/asl-alphabet
## Project Demo
https://www.youtube.com/watch?v=mIyWNsGfHAQ
### Prediction of Alphabet A

### Prediction of Alphabet L

### Final Message
