GATOR: Graph-Aware Transformer with Motion-Disentangled Regression for Human Mesh Reconstruction from a 2D Pose
This repository is the official implementation of GATOR: Graph-Aware Transformer with Motion-Disentangled Regression for Human Mesh Reconstruction from a 2D Pose (ICASSP 2023). The overall architecture of GATOR is shown below.
- We recommend using an Anaconda virtual environment. Install PyTorch >= 1.2 according to your GPU driver and Python >= 3.7.2, then run `sh requirements.sh`.
- Download the pre-trained GATOR models from here.
- Prepare the SMPL layer from here.
- Run `python demo/run.py --gpu 0 --input_pose demo/coco_joint_input.npy --joint_set coco`.
- The `--input_pose {2d_pose_path}` follows the skeleton topology of `--joint_set {coco, human36}`, which can be found in `./data/*/dataset.py`.
- The outputs will be saved in `./demo/result`.
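The demo consumes a `.npy` file of 2D joint coordinates. As a minimal sketch of preparing a custom input (the `(17, 2)` layout for the 17 COCO joints is an assumption; the authoritative topology is in `./data/*/dataset.py`):

```python
import numpy as np

# COCO skeleton has 17 joints; each row holds an (x, y) pixel coordinate.
# NOTE: the (17, 2) float32 layout is an assumption -- verify against
# ./data/COCO/dataset.py before feeding the file to demo/run.py.
num_joints = 17
pose_2d = np.zeros((num_joints, 2), dtype=np.float32)
pose_2d[0] = [320.0, 180.0]  # e.g., nose near the image center

np.save("my_pose_input.npy", pose_2d)

loaded = np.load("my_pose_input.npy")
print(loaded.shape)  # (17, 2)
```

The saved file would then be passed as `--input_pose my_pose_input.npy --joint_set coco`.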
Here is the performance of GATOR. For the Human3.6M benchmark, GATOR is trained on Human3.6M. For the 3DPW benchmark, GATOR is trained on Human3.6M, COCO, and MuCo.
|  | MPJPE | PA-MPJPE |
|---|---|---|
| Human36M | 64.0 mm | 44.7 mm |
| 3DPW | 87.5 mm | 56.8 mm |

|  | MPJPE | PA-MPJPE |
|---|---|---|
| Human36M | 48.8 mm | 31.2 mm |
| 3DPW | 50.8 mm | 30.5 mm |
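For reference, MPJPE is the mean Euclidean distance between predicted and ground-truth 3D joints (in mm), and PA-MPJPE measures the same error after a rigid Procrustes alignment of the prediction to the ground truth, so it ignores global rotation, translation, and scale. A self-contained NumPy sketch of both metrics (not the repository's evaluation code):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: mean Euclidean distance over joints."""
    return np.mean(np.linalg.norm(pred - gt, axis=-1))

def pa_mpjpe(pred, gt):
    """MPJPE after Procrustes (similarity) alignment of pred to gt."""
    p = pred - pred.mean(axis=0)
    g = gt - gt.mean(axis=0)
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, S, Vt = np.linalg.svd(g.T @ p)
    if np.linalg.det(U @ Vt) < 0:      # avoid a reflection
        U[:, -1] *= -1
        S[-1] *= -1
    R = U @ Vt
    scale = S.sum() / (p ** 2).sum()   # optimal isotropic scale
    aligned = scale * p @ R.T + gt.mean(axis=0)
    return mpjpe(aligned, gt)

np.random.seed(0)
gt = np.random.randn(17, 3)            # a dummy 17-joint 3D pose
theta = np.pi / 4
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
pred = gt @ Rz.T + 10.0                # rotated and translated copy of gt
print(mpjpe(pred, gt))                 # large: global pose differs
print(pa_mpjpe(pred, gt))              # ~0: alignment removes the difference
```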
We use the same datasets as Pose2Mesh. Please follow its instructions to prepare the datasets and files.
The `data` directory structure should follow the hierarchy below.
    ${ROOT}
    |-- data
    |   |-- base_data
    |   |   |-- J_regressor_extra.npy
    |   |   |-- J_regressor_h36m.npy
    |   |   |-- smpl_mean_params.npz
    |   |   |-- smpl_mean_vertices.npy
    |   |   |-- mesh_downsampling.npz
    |   |   |-- shortest_path_h36m.npy
    |   |   |-- shortest_path_3dpw.npy
    |   |   |-- path_h36m.npy
    |   |   |-- path_3dpw.npy
    |   |-- Human36M
    |   |   |-- images
    |   |   |-- annotations
    |   |   |-- absnet_output_on_testset.json
    |   |   |-- J_regressor_h36m_correct.npy
    |   |-- MuCo
    |   |   |-- data
    |   |   |   |-- augmented_set
    |   |   |   |-- unaugmented_set
    |   |   |   |-- MuCo-3DHP.json
    |   |   |   |-- smpl_param.json
    |   |-- COCO
    |   |   |-- images
    |   |   |   |-- train2017
    |   |   |   |-- val2017
    |   |   |-- annotations
    |   |   |-- J_regressor_coco.npy
    |   |   |-- hrnet_output_on_valset.json
    |   |-- PW3D
    |   |   |-- data
    |   |   |   |-- 3DPW_latest_test.json
    |   |   |   |-- 3DPW_latest_train.json
    |   |   |   |-- 3DPW_latest_validation.json
    |   |   |   |-- darkpose_3dpw_testset_output.json
    |   |   |   |-- darkpose_3dpw_validationset_output.json
    |   |   |-- imageFiles
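Among the base data are precomputed path files (`shortest_path_*.npy`, `path_*.npy`). Assuming they store all-pairs shortest hop distances between joints of the skeleton graph (an assumption; their exact contents are defined by the training code), such a table can be computed with Floyd-Warshall:

```python
import numpy as np

# Hypothetical 5-joint chain skeleton: joints 0-1-2-3-4 connected in a line.
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
n = 5

INF = 1e9
dist = np.full((n, n), INF)
np.fill_diagonal(dist, 0.0)
for i, j in edges:
    dist[i, j] = dist[j, i] = 1.0  # one hop per skeleton bone

# Floyd-Warshall: all-pairs shortest hop counts on the skeleton graph.
for k in range(n):
    dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])

print(dist[0, 4])  # 4.0 hops from joint 0 to joint 4
```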
- Download base data [data]
- Download Human3.6M parsed data and SMPL parameters [data][SMPL parameters from SMPLify-X]
- Download MuCo parsed/composited data and SMPL parameters [data][SMPL parameters from SMPLify-X]
- Download COCO SMPL parameters [SMPL parameters from SMPLify]
- Download 3DPW parsed data [data]
- All annotation files follow MS COCO format.
- If you want to add your own dataset, you have to convert it to MS COCO format.
- Images do not need to be downloaded, but if needed you can download them from their official sites.
- 2D pose detection outputs can be downloaded here: Human36M, COCO, 3DPW
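An MS COCO keypoint annotation stores, per person instance, an `image_id`, a `category_id`, and a flat `keypoints` list of (x, y, visibility) triplets. A minimal sketch of writing a custom dataset in this format (file names and values are illustrative):

```python
import json

# Illustrative single-image, single-person annotation in MS COCO keypoint style.
coco = {
    "images": [{"id": 1, "file_name": "000001.jpg", "width": 640, "height": 480}],
    "annotations": [{
        "id": 1,
        "image_id": 1,
        "category_id": 1,                # person
        # 17 COCO joints x (x, y, visibility), flattened to 51 numbers.
        "keypoints": [0.0, 0.0, 0] * 17,
        "num_keypoints": 0,
    }],
    "categories": [{"id": 1, "name": "person"}],
}

with open("my_dataset_coco_format.json", "w") as f:
    json.dump(coco, f)
```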
- For the SMPL layer, I used smplpytorch. The repo is already included in `${ROOT}/smplpytorch`.
- Download `basicModel_f_lbs_10_207_0_v1.0.0.pkl`, `basicModel_m_lbs_10_207_0_v1.0.0.pkl`, and `basicModel_neutral_lbs_10_207_0_v1.0.0.pkl` from here (female & male) and here (neutral) to `${ROOT}/smplpytorch/smplpytorch/native/models`.
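The SMPL layer maps pose and shape parameters to mesh vertices via linear blend skinning (LBS). As a toy illustration of LBS only, on a hypothetical two-bone rig (not the actual SMPL model or the smplpytorch API):

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Toy rig: two bones along x; one vertex attached to each bone.
rest_verts = np.array([[0.5, 0.0, 0.0], [1.5, 0.0, 0.0]])
joints = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
# Skinning weights: vertex 0 follows bone 0, vertex 1 follows bone 1.
weights = np.array([[1.0, 0.0], [0.0, 1.0]])

# Bone transforms: bone 0 fixed, bone 1 rotates 90 degrees about z at joint 1.
R = [np.eye(3), rot_z(np.pi / 2)]

posed = np.zeros_like(rest_verts)
for v in range(len(rest_verts)):
    for b in range(2):
        # Rotate the vertex about bone b's joint, blend by skinning weight.
        local = rest_verts[v] - joints[b]
        posed[v] += weights[v, b] * (R[b] @ local + joints[b])

print(posed[1])  # vertex 1 swings from (1.5, 0, 0) to (1.0, 0.5, 0)
```

The real SMPL model additionally blends each vertex across several joints and applies learned pose/shape blend shapes before skinning.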
Download the pretrained model weights from here to the corresponding directory.
    ${ROOT}
    |-- results
    |   |-- 3dpw_det.pth.tar
    |   |-- 3dpw.pth.tar
    |   |-- h36m_det.pth.tar
    |   |-- h36m.pth.tar
GATOR is trained in two stages: GAT is pre-trained first, and then the whole GATOR is trained after loading the GAT weights.
Select a config file in `./asset/yaml/` and train. You can change the train set and the pretrained posenet in your own `*.yml` file.
1. Pre-train PoseNet (GAT)

Use the config file `gat_*.yml` in `./asset/yaml/` to train GAT. Run `python main/train.py --gpu {GPU_id} --cfg ./asset/yaml/gat_{input joint set}_train_{dataset list}.yml`.
2. Train GATOR

Set `posenet_path` in the config file `gator_*.yml` to the GAT weights `./experiment/exp_*/checkpoint/best.pth.tar`, and set `posenet_pretrained` to `True`. Run `python main/train.py --gpu {GPU_id} --cfg ./asset/yaml/gator_{input joint set}_train_{dataset list}.yml`.
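Only `posenet_path` and `posenet_pretrained` are named above; assuming a flat key layout (the rest of the `gator_*.yml` schema is defined by the repository), the relevant fragment would look like:

```yaml
# Fragment of a gator_*.yml config; only these two keys are documented above.
posenet_path: ./experiment/exp_*/checkpoint/best.pth.tar
posenet_pretrained: True
```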
Select the config file in `./asset/yaml/` and test. Run `python main/test.py --gpu 0,1,2,3 --cfg ./asset/yaml/gator_{input joint set}_test_{dataset name}.yml`.

For example, to test on 3DPW using a detected 2D pose, run `python ./main/test.py --cfg ./asset/yaml/gator_cocoJ_test_human36_coco_muco_det.yml --gpu 0`.
Our code is built on the following repositories. We thank the authors for their open-source work.