Emotion prediction accuracy #334
-
Yes, the GEAR model is supported for emotion; I can provide some config examples later.
Higher resolution helps up to the point where the cropped face image is still of sufficient resolution, roughly 256x256; beyond that, lighting and clarity matter much more.
Human does all the necessary preprocessing (crop, resize, grayscale, normalize), but I never had enough time to optimize the cropping factor for each model, and it matters a lot. For example, should the image for GEAR be a tight face crop or slightly enlarged? It's all configurable, but the values are tuned only for some models, and emotion is not one of them.
It's only supported for initial face detection, not for subsequent per-model-specific processing, but even that first step is sufficient to visualize the cropping.
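As a rough illustration, a configuration enabling emotion detection alongside face detection might look like the sketch below. This is only a hedged example: the exact property names (`face.detector`, `face.emotion`, `modelPath`, `modelBasePath`) are assumptions based on human's typical config shape and should be checked against the actual config documentation, and the GEAR model filename is a placeholder.

```javascript
// Hypothetical config sketch for enabling a GEAR-based emotion model in human.
// All property names and the model path are assumptions, not verified API.
const humanConfig = {
  modelBasePath: '../models/',              // assumed location of model files
  face: {
    enabled: true,
    detector: { enabled: true, maxDetected: 1 }, // initial face detection step
    mesh: { enabled: false },                // mesh is optional if only emotion is needed
    emotion: { enabled: true, modelPath: 'gear-emotion.json' }, // placeholder filename
  },
};

console.log(JSON.stringify(humanConfig.face.emotion));
```

The idea is simply that each per-face module (detector, mesh, emotion) is toggled and configured independently, so an alternative emotion model would be swapped in via its module's `modelPath`.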
-
I had GEAR emotion support baked in, but I never published the actual models. That's done now, and I wrote a short test that uses different emotion detection variations:
Face detection runs at 256x256, but the subsequent face mesh runs at 192x192. If you don't need mesh results, you can run with mesh enabled or disabled and compare the results.
Cropping happens during face detection. All subsequent modules do additional processing, but they cannot expand the face if it is already cropped :) The demo I've provided shows how to specify the crop scale factor in the config; experiment with different values and you'll see how massive an impact it has.
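To make the crop scale factor concrete, here is a small standalone sketch of what scaling a detected face box by different factors does to the crop handed to downstream models. The `scaleCrop` helper and the box format `[x, y, width, height]` are illustrative assumptions, not human's internal implementation, and the config key for the factor should be taken from the demo, not from this snippet.

```javascript
// Illustrative helper: expand a detected face box around its center by a
// crop scale factor. This mimics the effect of the configurable crop factor;
// it is not human's actual internal code.
function scaleCrop(box, factor) {
  const [x, y, w, h] = box;       // box as [x, y, width, height] in pixels
  const nw = w * factor;
  const nh = h * factor;
  return [x - (nw - w) / 2, y - (nh - h) / 2, nw, nh];
}

const detectedBox = [100, 80, 120, 120]; // example detection in pixels
for (const factor of [1.0, 1.25, 1.5]) {
  const crop = scaleCrop(detectedBox, factor);
  console.log(`factor=${factor} crop=[${crop.map((v) => v.toFixed(1)).join(', ')}]`);
}
```

A tight crop (factor near 1.0) feeds the model mostly face pixels; a larger factor includes more surrounding context, and which works better is model-dependent, which is exactly why the value is worth tuning per model.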
-
It's not on the roadmap, as emotion in general was not my highest priority. But honestly, if the model is "sane", it should be quite easy to integrate. I'll take a look next week (I also need to make a slight update to the custom module guideline wiki, but nothing major). I'm going to convert this discussion to feature request #335 so it's easier to track.
-
Issue #335 is updated.
-
Hi Vlad!
I'm using human in the browser and am currently seeing very poor accuracy for emotion detection.
I'm using the default BlazeFace model, and the distance to the webcam is always around 2-3 meters.
I'd like to know whether there's an alternative emotion detection model supported by human, taking into account that the Oarriaga emotion model has 66% test accuracy. On your models wiki page you mention GEAR as an alternative. Is it possible to experiment with the GEAR emotion model instead of Oarriaga's? Looking at the GEAR repo, I can see the emotion classifier is independent of the GenderAgeRace model, but I'm not sure whether the emotion model is supported by human or not.
Some additional questions:
Thank you!