https://github.com/julienmalka/classreport
LauzHack 2018 Project
- Host: GitHub
- URL: https://github.com/julienmalka/classreport
- Owner: JulienMalka
- License: mit
- Created: 2018-11-26T08:48:03.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2018-11-26T08:48:31.000Z (almost 7 years ago)
- Last Synced: 2025-01-14T18:05:11.859Z (9 months ago)
- Topics: computer, hands-raised, motivation, students, vision
- Language: CSS
- Homepage:
- Size: 777 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# ClassReport
_LauzHack 2018 project, made by Ulysse Ramage, Jean Chambras, Julien Malka and Gaspard Peduzzi_

## Table of Contents
- [Installation](#installation)
- [Motivations](#motivations)
- [Technical Considerations](#tech)

## Installation
### Dependencies :gear:
* To run the project you need `python v3.5` and `node v10`, along with `npm` and `pip3`
* You need to create a `config.js` file at `client/src/api/config.js` with the following content:

```js
export default {
  // Your endpoint for Microsoft Azure
  faceUri: "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect",
  // Your subscription key for this service
  faceKey: "xxxxxxxxxxxxxxxxxxxxxxx",
  // The endpoint of your Python server used for hand-raising recognition
  handUri: "http://xxx.xxx.xxx.xxx:5000/hand-raised"
}
```
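As a reference for how the client consumes these values, here is a minimal sketch, assuming Python with the `requests` package, of a call against `faceUri`; the `detect_emotions` helper and the sample image path are illustrative, not part of the repository:

```python
# Hypothetical helper: sends one frame to the Azure Face API configured
# above and returns the per-face emotion scores.
import requests

FACE_URI = "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect"
FACE_KEY = "xxxxxxxxxxxxxxxxxxxxxxx"  # same value as faceKey in config.js

def detect_emotions(image_path):
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        FACE_URI,
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    # One entry per detected face, with scores for anger, contempt, disgust,
    # fear, happiness, neutral, sadness and surprise.
    return [face["faceAttributes"]["emotion"] for face in response.json()]

if __name__ == "__main__":
    for emotions in detect_emotions("classroom.jpg"):
        print(max(emotions, key=emotions.get), emotions)
```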
### Launch the project :rocket:

Start the __server__ project with:
```sh
cd server
./getModels.sh
pip3 install -r requirements.txt
python3 server.py
```
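The server exposes the `/hand-raised` route that `handUri` in `config.js` points at. As a rough sketch only, assuming Flask (the request and response shapes here are guesses, not the project's actual API), such an endpoint could look like:

```python
# Illustrative sketch of a /hand-raised endpoint; the real implementation
# lives in server/server.py.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/hand-raised", methods=["POST"])
def hand_raised():
    frame = request.get_data()  # raw image bytes sent by the client
    # The real server would run OpenPose on the frame here and apply a
    # raised-hand check like the one sketched in the Tech section below.
    raised_hands = 0  # placeholder result
    return jsonify({"raised": raised_hands})

if __name__ == "__main__":
    # Port 5000 matches the handUri entry in client/src/api/config.js.
    app.run(host="0.0.0.0", port=5000)
```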
Start the __application__ project in another window with:

```sh
cd client
npm install
npm run start
```

## Motivations :mortar_board:
The goal was to build, in the time available, a very **intuitive dashboard** through which a speaker (teacher, lecturer, etc.) can follow and be informed of the **audience's interactions**. These interactions fall into two categories:
- **Hands raised**, which can be used to run live polls directly or to track people's participation along the session timeline
- **People's expressions**, analyzing the following feelings for every person at every moment of the session: _anger, contempt, disgust, fear, happiness, neutral, sadness, surprise_ :relieved: :neutral_face: :worried:

## Tech
This project uses state-of-the-art computer vision and machine learning methods to identify the students' emotions and hand-raising gestures.
For the latter, we use OpenPose to estimate each student's pose and hence detect whether they are raising a hand.
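To make the idea concrete, here is a rough sketch of that kind of check, assuming OpenPose's 18-keypoint COCO output where each keypoint is an `(x, y, confidence)` triplet; the project's actual rule may differ:

```python
# COCO keypoint indices used by OpenPose (assumption for this sketch).
R_SHOULDER, R_WRIST = 2, 4
L_SHOULDER, L_WRIST = 5, 7

def is_hand_raised(keypoints, min_conf=0.3):
    """Return True if either wrist is confidently detected above its shoulder.

    Image y-coordinates grow downwards, so "above" means a smaller y value.
    """
    for shoulder, wrist in ((R_SHOULDER, R_WRIST), (L_SHOULDER, L_WRIST)):
        _, sy, sc = keypoints[shoulder]
        _, wy, wc = keypoints[wrist]
        if sc >= min_conf and wc >= min_conf and wy < sy:
            return True
    return False
```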