Hello,
First, thank you for the great database, code files, and instructions.
I am trying to reproduce the results from the paper, and I have several questions about a few things I didn't quite understand.
(I have little experience with deep learning and Python, so please excuse me if my questions are naive.)
I used the MATLAB files provided in the repo to generate the h5 files.
For each movement (in each session) there are two files, for example:
"S12_session5_mov4_7500events.h5"
"S12_session5_mov4_7500events_label.h5"
What information does each file represent?
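To make sure I'm reading them correctly, this is how I currently inspect the two files — just a minimal sketch using h5py; I'm not assuming anything about the dataset names, only printing whatever the MATLAB export actually produced:

```python
import h5py

# Walk both files and print every dataset name with its shape,
# so I can see what the MATLAB scripts actually wrote.
for path in ["S12_session5_mov4_7500events.h5",
             "S12_session5_mov4_7500events_label.h5"]:
    with h5py.File(path, "r") as f:
        print(path)
        f.visititems(lambda name, obj:
                     print(" ", name, getattr(obj, "shape", "(group)")))
```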
What exact data does the provided DHP_CNN.model process?
That is, after I have generated the h5 files, what are the next steps I need to perform in order to run DHP_CNN.model and calculate the MPJPE?
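For what it's worth, this is my current guess at how to run the model — only a sketch, assuming DHP_CNN.model is a Keras model that takes single-channel constant-count event frames; the "DVS_frame" dataset name and the preprocessing are my assumptions, so please correct me if they are wrong:

```python
import h5py
import numpy as np
from tensorflow.keras.models import load_model

# Load the pretrained network shipped with the repo.
model = load_model("DHP_CNN.model")
model.summary()  # check the input/output shapes it actually expects

with h5py.File("S12_session5_mov4_7500events.h5", "r") as f:
    # "DVS_frame" is a guessed dataset name -- substitute whatever
    # the inspection script above prints.
    frames = np.array(f["DVS_frame"], dtype=np.float32)

# Assuming frames arrive as (num_frames, height, width):
# normalize (my guess at the preprocessing) and add a channel axis.
frames /= frames.max()
frames = frames[..., np.newaxis]

# For this network I would expect per-joint heatmaps as output,
# e.g. (num_frames, height, width, num_joints).
pred = model.predict(frames, batch_size=8)
print(pred.shape)
```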
How should I compute the MPJPE, given that each movement spans a number of frames?
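My understanding is that the MPJPE is the Euclidean distance between predicted and ground-truth joint positions, averaged over all joints and all frames — is that right? Here is a sketch, assuming I can arrange predictions and labels as (num_frames, num_joints, num_dims) arrays (extracting joint coordinates from the heatmaps, e.g. by argmax, is part of what I'm unsure about):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean Per Joint Position Error.

    pred, gt: (num_frames, num_joints, num_dims) arrays; num_dims is
    2 for pixel coordinates or 3 for coordinates in mm.
    Returns the Euclidean error averaged over joints and frames.
    """
    return np.linalg.norm(pred - gt, axis=-1).mean()

# Hypothetical usage with stand-in data (100 frames, 13 joints, 2D):
pred = np.random.rand(100, 13, 2)
gt = np.random.rand(100, 13, 2)
print(mpjpe(pred, gt))
```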
Thank you for your help,
Michael