FacePsy is designed to capture real-time facial behavior primitives as users interact with their mobile devices.
This is the official codebase of the affective mobile sensing system described in the paper FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings, accepted at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI 2024).
Our work on mood detection using pupillary responses, titled MoodPupilar: Predicting Mood Through Smartphone Detected Pupillary Responses in Naturalistic Settings, was accepted at IEEE-EMBS BSN 2024.
- Real-time tracking of facial behavior primitives (e.g. action units (AUs), head pose, facial expressions)
- App usage tracking (e.g. screen on/off, app open/close)
- Cognitive assessment (e.g. Stroop, Visual Spatial Memory, etc.)
- Custom EMA delivery
- Background data collection
- Trigger-based start/stop of data collection (e.g. screen on/off, app open/close)
- Real-time feature extraction of facial behavior primitives, stored in a remote database
- Automatic restart on device reboot or app crash, resuming data collection
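As a rough illustration of the trigger model (not FacePsy's actual implementation, which reacts to Android system events inside background services), a trigger can be thought of as an event that starts or stops a named collection session:

```python
# Toy model of trigger-driven data collection: each event either starts or
# stops a named session. Purely illustrative; the real app listens for
# Android system broadcasts (screen on/off, app open/close) in a service.

class TriggerRecorder:
    def __init__(self):
        self.active = set()   # sessions currently collecting data
        self.log = []         # (action, event) history

    def on_event(self, event: str):
        # Events ending in "_on"/"_open" start collection; "_off"/"_close" stop it.
        if event.endswith(("_on", "_open")):
            self.active.add(event.rsplit("_", 1)[0])
            self.log.append(("start", event))
        elif event.endswith(("_off", "_close")):
            self.active.discard(event.rsplit("_", 1)[0])
            self.log.append(("stop", event))

rec = TriggerRecorder()
for e in ["screen_on", "app_open", "app_close", "screen_off"]:
    rec.on_event(e)
print(rec.active)  # set() — every session was closed by its matching event
```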
- Install JDK 1.8, e.g. Amazon Corretto 8
- Install Jetbrains Toolbox
- Install Android Studio 2022.1 via Jetbrains Toolbox.
- Install NDK v20.0.5594570 via Android Studio.
- Install CMake v3.6 via Android Studio.
- Download and extract OpenCV v4.0.1 for Android.
- Open Firebase Console and create a new project.
- Register the Android app to your Firebase project.
- Download the `google-services.json` file and copy it to the `./FacePsy/app/` directory.
- Go back to your Firebase project, select `Authentication`, and enable `Email/Password`.
- Go back to your Firebase project, select `Cloud Firestore`, and click `Create Database`.
Replace the default rules with the following configuration, modifying it as needed.
```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write: if request.auth != null;
    }
  }
}
```
- Collection `config`
  - Document `survey` (`db.collection("config").document("survey")`)
    - `preLink`: https://example.com/
    - `postLink`: https://example.com/
  - Document `triggers` (`db.collection("config").document("triggers")`)
    - `apps`: array of maps
      - [0]
        - `packageName`: "flowerGame"
        - `enable`: true / false
      - [1]
        - `packageName`: "stroopTask"
        - `enable`: true / false
  - Document `triggerDuration` (`db.collection("config").document("triggerDuration")`)
    - `app`: 100
    - `flowerGame`: 100
    - `unlockEvent`: 100
    - `stroopTask`: 100
  - Document `stroopTask` (`db.collection("config").document("stroopTask")`)
    - `rounds`: 3
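The document layout above can be mirrored as plain Python dictionaries, for example to sanity-check a configuration before uploading it to Firestore. The `validate_config` helper below is purely illustrative and not part of FacePsy:

```python
# Sketch of the expected `config` collection, mirrored as plain Python dicts.
# Field names and values follow the document layout described above; the
# validation helper is an illustrative add-on, not part of FacePsy itself.

config_collection = {
    "survey": {
        "preLink": "https://example.com/",
        "postLink": "https://example.com/",
    },
    "triggers": {
        "apps": [
            {"packageName": "flowerGame", "enable": True},
            {"packageName": "stroopTask", "enable": False},  # true / false
        ],
    },
    "triggerDuration": {
        "app": 100,
        "flowerGame": 100,
        "unlockEvent": 100,
        "stroopTask": 100,
    },
    "stroopTask": {"rounds": 3},
}

def validate_config(cfg: dict) -> list:
    """Return a list of problems; an empty list means the layout matches."""
    problems = []
    for doc in ("survey", "triggers", "triggerDuration", "stroopTask"):
        if doc not in cfg:
            problems.append("missing document: " + doc)
    for app in cfg.get("triggers", {}).get("apps", []):
        if not {"packageName", "enable"} <= app.keys():
            problems.append("malformed app entry: " + repr(app))
    return problems

if __name__ == "__main__":
    print(validate_config(config_collection))  # [] — layout is complete
```

Each top-level key corresponds to one Firestore document in the `config` collection, so the same dicts can be passed to a `set()` call if you seed the collection programmatically with the Firebase Admin SDK.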
- Clone the repository
- Before importing the project into Android Studio, a small edit is needed:
  - Open `app/CMakeLists.txt`.
  - Replace the `PROJECT_PATH` and `OPENCV_PATH` variables with your own paths.
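For reference, the two variables in `app/CMakeLists.txt` are plain CMake `set()` calls; the paths below are placeholders for your local checkout and the extracted OpenCV Android SDK (the exact variable lines may differ slightly in your version of the file):

```cmake
# Placeholder paths — point these at your own clone and OpenCV Android SDK.
set(PROJECT_PATH /home/you/FacePsy)
set(OPENCV_PATH  /home/you/OpenCV-android-sdk)
```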
- Open the project in Android Studio
- Set the NDK path in the `local.properties` file
- Set the JDK version to 1.8 under Preferences > Build Tools > Gradle
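`local.properties` is a simple key/value file; an NDK entry typically looks like the following. The paths are placeholders for your own install locations (note that newer Android Gradle plugin versions prefer `ndkVersion` in `build.gradle` over `ndk.dir`):

```properties
# Placeholder paths — adjust to your SDK/NDK install locations.
sdk.dir=/home/you/Android/Sdk
ndk.dir=/home/you/Android/Sdk/ndk/20.0.5594570
```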
- Add Firebase to the project (Follow the instructions here)
- Build the project
- Open the app
- Create an account, or login if you already have an account
- The app will ask for permissions to access the camera, storage, etc.
- The app registers its background services and starts collecting data based on the configured triggers
The study admin/researcher/developer can access the data collected by the app by logging into the Firebase console, available via the FacePsy Web Portal.
The following parameters can be accessed by the study admin/researcher/developer:
- Customizable triggers
- EMA link and delivery
- Data collection length on each trigger type
- Data collection frequency
Thanks to CottaCush/HiddenCam.
If you find this repository useful, please consider giving a star ⭐ and citation using the given BibTeX entry:
@article{10.1145/3676505,
author = {Islam, Rahul and Bae, Sang Won},
title = {FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings},
year = {2024},
issue_date = {September 2024},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {8},
number = {MHCI},
url = {https://doi.org/10.1145/3676505},
doi = {10.1145/3676505},
month = sep,
articleno = {260},
numpages = {32},
keywords = {affective computing, application instrumentation, depression, empirical study that tells us about people, field study, machine learning, mobile computing, system}
}
This project is licensed under the MIT License.
If you have any questions or suggestions, please feel free to contact Rahul.