Project Title: OpenVINO Emotion Analysis

This project builds a pipeline that identifies each student/lecturer in the classroom and captures their emotions. At the end, it generates a report card for each student/lecturer that shows how their behaviour fluctuates over the course of the day.

Prerequisites

[Note: Only works on Windows; not tested on Linux/macOS.]

  1. Download and install OpenVINO along with the other dependency software, and run the demo mentioned in the installation guide to check that everything is installed properly. [Note: I used OpenVINO version 2019_R3.1]
  2. Download and install Anaconda with Python 3.x.

Getting Started

  1. Clone this repo:
  git clone https://github.com/apthagowda97/openvino_emotion_analysis.git
  2. Open Command Prompt, change the directory to the cloned repository, and run the commands below:
  C:\> cd [path]\openvino_emotion_analysis
  C:\[path]\openvino_emotion_analysis> run.bat
  C:\[path]\openvino_emotion_analysis> jupyter notebook
  3. Run the notebooks emotion_analysis.ipynb and emotion_analysis_explained.ipynb.

Description

Phase 1:

flowchart

1. Breaks the video data into frames.

video frames
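A minimal sketch of this step using OpenCV; the video file name is a placeholder, not taken from the repo:

```python
import cv2

# Break the input video into frames; "classroom.mp4" is a placeholder name.
cap = cv2.VideoCapture("classroom.mp4")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
cap.release()
print(f"Extracted {len(frames)} frames")
```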

2. Takes a single frame at a time.

frame1

3. Runs the face-detection-retail-0005 model to detect each student's face.

frame2 faces
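A sketch of this step with the 2019-era OpenVINO Python API; the model paths, the input image, and the 0.5 confidence threshold are assumptions, not the notebook's exact code:

```python
import cv2
from openvino.inference_engine import IECore, IENetwork

# Load face-detection-retail-0005 (paths are placeholders).
ie = IECore()
net = IENetwork(model="face-detection-retail-0005.xml",
                weights="face-detection-retail-0005.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
input_blob = next(iter(net.inputs))
n, c, h, w = net.inputs[input_blob].shape           # 1x3x300x300

frame = cv2.imread("frame.jpg")                     # placeholder frame from step 2
blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1).reshape(n, c, h, w)
out = exec_net.infer({input_blob: blob})

# Output is [1, 1, N, 7]: image_id, label, conf, x_min, y_min, x_max, y_max
detections = next(iter(out.values()))[0][0]
fh, fw = frame.shape[:2]
faces = []
for _, _, conf, x_min, y_min, x_max, y_max in detections:
    if conf > 0.5:                                  # assumed threshold
        faces.append(frame[int(y_min * fh):int(y_max * fh),
                           int(x_min * fw):int(x_max * fw)])
```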

4. Runs the age-gender-recognition-retail-0013 model to identify the gender of each face.

faces_with gender
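A sketch of the gender step on one cropped face; the blob names and shapes follow the Open Model Zoo documentation for age-gender-recognition-retail-0013, and the crop file name is a placeholder:

```python
import cv2
from openvino.inference_engine import IECore, IENetwork

ie = IECore()
net = IENetwork(model="age-gender-recognition-retail-0013.xml",
                weights="age-gender-recognition-retail-0013.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
input_blob = next(iter(net.inputs))                 # expects 1x3x62x62

face = cv2.imread("face_crop.jpg")                  # placeholder crop from step 3
blob = cv2.resize(face, (62, 62)).transpose(2, 0, 1)[None]
out = exec_net.infer({input_blob: blob})
age = float(out["age_conv3"].squeeze()) * 100       # model predicts age / 100
gender = "male" if float(out["prob"].squeeze()[1]) > 0.5 else "female"
print(round(age), gender)
```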

5. Runs the emotions-recognition-retail-0003 model to recognize 5 emotions [neutral, happy, sadness, surprise, anger].

frame3 emotion
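A sketch of the emotion step on one cropped face; the output blob name and label order follow the Open Model Zoo documentation for emotions-recognition-retail-0003, and the crop file name is a placeholder:

```python
import cv2
from openvino.inference_engine import IECore, IENetwork

EMOTIONS = ["neutral", "happy", "sadness", "surprise", "anger"]

ie = IECore()
net = IENetwork(model="emotions-recognition-retail-0003.xml",
                weights="emotions-recognition-retail-0003.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
input_blob = next(iter(net.inputs))                 # expects 1x3x64x64

face = cv2.imread("face_crop.jpg")                  # placeholder crop from step 3
blob = cv2.resize(face, (64, 64)).transpose(2, 0, 1)[None]
probs = exec_net.infer({input_blob: blob})["prob_emotion"].squeeze()
print(EMOTIONS[int(probs.argmax())])
```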

6. Plots the overall emotion of the frame and adds the plot to the frame.

emotion graph frame
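A minimal sketch of how such a per-frame summary could be plotted; frame_emotions is hypothetical data, and the saved image could then be composited onto the frame:

```python
import collections
import matplotlib.pyplot as plt

EMOTIONS = ["neutral", "happy", "sadness", "surprise", "anger"]
frame_emotions = ["happy", "neutral", "happy", "surprise"]   # hypothetical per-face results

# Count how many faces show each emotion in this frame and plot a bar chart.
counts = collections.Counter(frame_emotions)
plt.bar(EMOTIONS, [counts.get(e, 0) for e in EMOTIONS])
plt.title("Overall emotion of the frame")
plt.savefig("frame_emotion_graph.png")              # placeholder output name
```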

7. Output video (gif)

output video

Phase 2:

flowchart

1. All the distinct faces identified by the face-reidentification-retail-0095 model.

face_db
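A sketch of how re-identification could work: each crop is mapped to an embedding with face-reidentification-retail-0095, and crops whose embeddings are close in cosine similarity are treated as the same person. The 0.6 threshold and the FACE_&lt;n&gt; naming are assumptions, not taken from the notebooks:

```python
import cv2
import numpy as np
from openvino.inference_engine import IECore, IENetwork

ie = IECore()
net = IENetwork(model="face-reidentification-retail-0095.xml",
                weights="face-reidentification-retail-0095.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
input_blob = next(iter(net.inputs))                 # expects 1x3x128x128

def embed(face):
    # L2-normalised 256-d embedding of one face crop.
    blob = cv2.resize(face, (128, 128)).transpose(2, 0, 1)[None]
    vec = next(iter(exec_net.infer({input_blob: blob}).values())).flatten()
    return vec / np.linalg.norm(vec)

known = {}                                          # face ID -> reference embedding

def identify(face, threshold=0.6):                  # assumed similarity threshold
    vec = embed(face)
    for fid, ref in known.items():
        if float(ref @ vec) > threshold:            # cosine similarity
            return fid
    fid = f"FACE_{len(known)}"                      # hypothetical ID scheme
    known[fid] = vec
    return fid
```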

2. Face database after removing the persons whose faces were captured fewer than 15 times.

face_db
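A sketch of this pruning step; the face_db name is hypothetical and the data is a toy example, but the count of 15 mirrors the description above:

```python
MIN_COUNT = 15

# Toy face database: face ID -> list of collected crops (placeholders).
face_db = {"FACE_0": ["crop"] * 40, "FACE_1": ["crop"] * 22, "FACE_5": ["crop"] * 3}

# Keep only the identities that were seen at least MIN_COUNT times.
face_db = {fid: crops for fid, crops in face_db.items() if len(crops) >= MIN_COUNT}
print(sorted(face_db))                              # FACE_5 is dropped
```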

3. Emotion analysis report (a plotting sketch follows the examples below):

1.1 Face database of (Face ID: FACE_0)

face_1

1.2 Emotion analysis of (Face ID: FACE_0)

report_1

2.1 Face database of (Face ID: FACE_1)

face_1

2.2 Emotion analysis of (Face ID: FACE_1)

report_1

3.1 Face database of (Face ID: FACE_2)

face_1

3.2 Emotion analysis of (Face ID: FACE_2)

report_1

4.1 Face database of (Face ID: FACE_17)

face_1

4.2 Emotion analysis of (Face ID: FACE_17)

report_1
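A minimal sketch of how a per-person report like the ones above could be plotted; emotion_log is hypothetical data, not output from the repo:

```python
import matplotlib.pyplot as plt

EMOTIONS = ["neutral", "happy", "sadness", "surprise", "anger"]
# Hypothetical (frame index, emotion) log for one face ID.
emotion_log = [(0, "neutral"), (1, "happy"), (2, "happy"), (3, "surprise")]

frames = [f for f, _ in emotion_log]
values = [EMOTIONS.index(e) for _, e in emotion_log]
plt.step(frames, values, where="post")              # emotion over time for one person
plt.yticks(range(len(EMOTIONS)), EMOTIONS)
plt.xlabel("Frame")
plt.title("Emotion analysis of FACE_0")
plt.savefig("report_FACE_0.png")                    # placeholder output name
```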

Uses

  1. It can be installed in a classroom to monitor the real-time behaviour of the students and report those whose behaviour falls below a chosen threshold.
  2. It preserves the students' privacy, and everything is done at the edge. There is no need to store a student face database in advance: the database is built as new faces are captured, and only students whose behaviour is below par are reported.

Author

  • Aptha Gowda

Acknowledgments

  • Netflix for the Stranger Things short clip
  • Intel OpenVINO scholarship
