https://github.com/harshitwaldia/face_emotion_recognition
Face emotion recognition uses computer vision to identify emotions displayed on human faces, analyzing expressions like happiness, sadness, anger, and more.
- Host: GitHub
- URL: https://github.com/harshitwaldia/face_emotion_recognition
- Owner: HarshitWaldia
- License: cc0-1.0
- Created: 2023-12-24T09:25:30.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-07-01T18:07:26.000Z (10 months ago)
- Last Synced: 2025-02-12T14:23:58.126Z (3 months ago)
- Language: Jupyter Notebook
- Size: 76.2 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
# Face Emotion Recognition
This project utilizes the FER-2013 dataset to recognize facial emotions. It consists of two main components:
1. Real-Time Recognition: Real-time detection and recognition of facial emotions.
2. Input-Based Recognition: Allows users to select an image or video file from their system for emotion recognition.

## Dataset
The FER-2013 dataset is a publicly available dataset containing grayscale images of faces labeled with one of seven emotions: angry, disgust, fear, happy, sad, surprise, or neutral. You can find more information and download the dataset [here](https://www.kaggle.com/datasets/msambare/fer2013).
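The Kaggle copy of FER-2013 linked above unpacks into `train/` and `test/` folders with one subfolder of 48×48 grayscale JPEGs per emotion. A small sketch for walking that layout (the directory and file-extension details are assumptions based on the common Kaggle release, not confirmed by this repo):

```python
from pathlib import Path

# The seven FER-2013 emotion classes, in alphabetical order (the order a
# directory-based loader such as Keras' flow_from_directory would assign).
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def load_fer2013(root="fer2013"):
    """Yield (image_path, label) pairs from a FER-2013 directory tree."""
    for split in ("train", "test"):
        for emotion in EMOTIONS:
            for img in Path(root, split, emotion).glob("*.jpg"):
                yield img, emotion
```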
## Model
The model used in this project achieves an accuracy of approximately 65%. You can download the model from [here](https://github.com/HarshitWaldia/Face_Emotion_Recognition/tree/main/Emotion-Model).
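Once downloaded, the model can be loaded with Keras and its 7-way softmax output mapped back to an emotion label. A minimal sketch; the file name `emotion_model.h5` and the alphabetical class order are assumptions, not confirmed by the repo:

```python
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def top_emotion(probabilities):
    """Map a 7-way softmax output (any sequence of 7 scores) to its label."""
    best = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    return EMOTIONS[best]

def load_emotion_model(path="emotion_model.h5"):
    # Deferred import so the rest of the sketch works without TensorFlow.
    from tensorflow.keras.models import load_model
    return load_model(path)
```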
## Usage
### Real-Time Recognition
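The steps below boil down to: capture webcam frames, detect faces, and classify each face crop with the trained model. A rough sketch of that loop, assuming OpenCV with its bundled Haar cascade and a Keras model saved as `emotion_model.h5` (the file name is an assumption):

```python
import numpy as np

EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def to_model_input(gray_face):
    """Scale a 48x48 grayscale face crop to [0, 1] and add batch/channel axes."""
    x = np.asarray(gray_face, dtype="float32") / 255.0
    return x.reshape(1, 48, 48, 1)

def main():
    import cv2
    from tensorflow.keras.models import load_model

    model = load_model("emotion_model.h5")
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            probs = model.predict(to_model_input(face), verbose=0)[0]
            label = EMOTIONS[int(np.argmax(probs))]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        cv2.imshow("Emotion", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```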
1. Open the real-time notebook: [Real-Time](https://github.com/HarshitWaldia/Face_Emotion_Recognition/blob/main/Real_Time_Emotion_Detection.ipynb)
2. Run the real-time recognition script.
3. The script will start a real-time video feed, detecting and recognizing emotions in faces.

### Input-Based Recognition
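Input-based recognition runs the same detection-and-classification step on a file the user picks instead of a webcam feed. A rough sketch, again assuming OpenCV and a model file named `emotion_model.h5` (the names are assumptions):

```python
import numpy as np

EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".bmp"}

def is_image_file(path):
    """Crude extension check for supported image types."""
    import os
    return os.path.splitext(path)[1].lower() in IMAGE_EXTS

def recognize_file(path, model_path="emotion_model.h5"):
    """Detect faces in an image file and print the predicted emotion for each."""
    import cv2
    from tensorflow.keras.models import load_model

    model = load_model(model_path)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        batch = (face.astype("float32") / 255.0).reshape(1, 48, 48, 1)
        probs = model.predict(batch, verbose=0)[0]
        print((x, y, w, h), EMOTIONS[int(np.argmax(probs))])

if __name__ == "__main__":
    import sys
    if len(sys.argv) > 1 and is_image_file(sys.argv[1]):
        recognize_file(sys.argv[1])
```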
1. Open the input-based notebook: [Input-Based](https://github.com/HarshitWaldia/Face_Emotion_Recognition/blob/main/Input_based_Emotion_Detection.ipynb)
2. Run the input-based recognition script.
3. Follow the instructions to select an image or video file from your system for emotion recognition.

## Personal Test Images and Videos
Please note that the repository contains personal test images and videos. These are intended for demonstration purposes only and should not be reused elsewhere without permission.
## Report
For more details about the project, including methodology, results, and analysis, please refer to the [Project Report](https://github.com/HarshitWaldia/Face_Emotion_Recognition/blob/main/Project_Report_FER.pdf).
## License
This project is licensed under the [CC0 1.0 Universal (CC0 1.0) License](https://choosealicense.com/licenses/cc0-1.0/). See the [LICENSE.txt](https://github.com/HarshitWaldia/Face_Emotion_Recognition/blob/main/LICENSE.txt) file for details.
## Contribution
Contributions to this project are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.