
FastImplicitPleth

Repository for fast implicit representations for plethysmography signals.

This code has been tested on Ubuntu 20.04.

Requirements (Most Relevant)

Code References

Implicit Neural Models to Extract Heart Rate from Video
Pradyumna Chari, Anirudh Bindiganavale Harish, Adnan Armouti, Alexander Vilesov, Sanjit Sarda, Laleh Jalilian, Achuta Kadambi

For the citation format, please refer to the Citation section below.


Dataset and Pre-prep

The FastImplicitPleth dataset can be downloaded by filling this Google Form.

If you choose to collect your own data, use face-cropping software (MTCNN in our case) to crop the face, save each frame as an image inside the volunteer/trial folder, and follow the hierarchy below to obtain a dataset with the same structure as the FastImplicitPleth dataset.
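
Once frames are cropped, writing them out in the expected layout can be sketched as below. This is a minimal illustration, not the authors' preprocessing code: `save_trial` is a hypothetical helper, and Pillow is assumed for PNG writing; the file names follow the hierarchy described in this README.

```python
# Sketch of writing one trial into the FastImplicitPleth layout.
# Assumption: `frames` is a (T, H, W, 3) uint8 array of already-cropped
# faces and `ppg` is the (T,) ground-truth plethysmograph signal.
from pathlib import Path
import tempfile

import numpy as np
from PIL import Image  # assumption: Pillow for PNG encoding

def save_trial(root, volunteer_id, trial_id, frames, ppg):
    """Write frames as rgbd_rgb_{i}.png plus rgbd_ppg.npy under v_{vol}_{trial}."""
    trial_dir = Path(root) / "rgb_files" / f"v_{volunteer_id}_{trial_id}"
    trial_dir.mkdir(parents=True, exist_ok=True)
    for i, frame in enumerate(frames):
        Image.fromarray(frame).save(trial_dir / f"rgbd_rgb_{i}.png")
    np.save(trial_dir / "rgbd_ppg.npy", np.asarray(ppg))
    return trial_dir

# Demo with dummy data in a scratch directory.
demo_root = tempfile.mkdtemp()
trial = save_trial(demo_root, 1, 1,
                   np.zeros((4, 8, 8, 3), dtype=np.uint8), np.zeros(4))
```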

Hierarchy of the FastImplicitPleth dataset - RGB Files

```
|
|--- rgb_files
|        |
|        |--- volunteer id 1, trial 1 (v_1_1)
|        |         |
|        |         |--- frame 0 (rgbd_rgb_0.png)
|        |         |--- frame 1 (rgbd_rgb_1.png)
|        |         |--- ...
|        |         |--- last frame (rgbd_rgb_899.png)
|        |         |--- ground-truth PPG (rgbd_ppg.npy)
|        |
|        |--- volunteer id 1, trial 2 (v_1_2)
|        |--- ...
|        |--- volunteer id 2, trial 1 (v_2_1)
|        |--- ...
|
|--- fitzpatrick labels file (fitzpatrick_labels.pkl)
|--- folds pickle file (demo_fold.pkl)
```

NNDL Execution

Before running the following commands, ensure that the configuration files and flags are set correctly for your environment.

In particular, check configs/dataset/ch_appearance_{set}.json -> checkpoints.dir, and configs/dataset/residual_{set}.json -> checkpoints.dir and appearance_model.

  1. Run `python auto_dataset_appearance.py`
  2. Run `python auto_dataset_residual.py`
  3. Run `inference.ipynb`
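
Since both scripts depend on the checkpoint paths above, a small sanity check before launching them can save a failed run. The sketch below assumes a nested `{"checkpoints": {"dir": ...}}` JSON layout, inferred from the `checkpoints.dir` notation; the actual config structure in the repository may differ, and the file name used in the demo is hypothetical.

```python
# Sketch: verify that checkpoints.dir is set in a config before training.
import json
import tempfile
from pathlib import Path

def check_config(path):
    """Return checkpoints.dir from a config JSON, or raise if it is unset."""
    cfg = json.loads(Path(path).read_text())
    ckpt_dir = cfg.get("checkpoints", {}).get("dir")
    if not ckpt_dir:
        raise ValueError(f"{path}: checkpoints.dir is not set")
    return ckpt_dir

# Demo with a hypothetical config file.
cfg_path = Path(tempfile.mkdtemp()) / "residual_demo.json"
cfg_path.write_text(json.dumps({"checkpoints": {"dir": "ckpts/residual"}}))
ckpt = check_config(cfg_path)
```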

Citation

```
@inproceedings{chari2024implicit,
  title={Implicit Neural Models to Extract Heart Rate from Video},
  author={Chari, Pradyumna and Harish, Anirudh Bindiganavale and Armouti, Adnan and Vilesov, Alexander and Sarda, Sanjit and Jalilian, Laleh and Kadambi, Achuta},
  booktitle={European Conference on Computer Vision},
  year={2024},
  organization={Springer}
}
```
