This documentation provides more details on procedurally synthesizing interactions with new shapes.
Before continuing, please make sure you have installed the dependencies properly and downloaded the quick start demo data.
We use shapes from the ShapeNet, Objaverse, and ABO datasets. Depending on your interests, you may download one or more of these datasets:
- ShapeNet: Download from here. Unzip and change `SHAPENET_ROOT` in `paths.py` to your local path.
- Objaverse: No need to pre-download; just run `pip install objaverse` and objects will be downloaded automatically when running the rendering script. Warning: Objaverse downloads are saved to the home directory (`~/.objaverse`); modify `BASE_PATH` in `objaverse.__init__.py` if that is not desired.
- ABO: Download from the ABO website. We use only the 3D models. Download, unzip, and change `ABO_ROOT` in `paths.py` to your local path.
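As a sketch of what the `paths.py` setup might look like: `SHAPENET_ROOT` and `ABO_ROOT` are the variable names from the steps above, but the example paths and the `missing_roots` helper are our own illustration, not part of the repo.

```python
import os

# Edit these to point at your local copies (placeholder paths, not real defaults).
SHAPENET_ROOT = "/path/to/ShapeNetCore"
ABO_ROOT = "/path/to/abo-3dmodels"

def missing_roots(roots):
    """Return the names of dataset roots that do not exist on disk."""
    return [name for name, path in roots.items() if not os.path.isdir(path)]
```

Running `missing_roots({"SHAPENET_ROOT": SHAPENET_ROOT, "ABO_ROOT": ABO_ROOT})` before rendering is a quick way to catch a mistyped path early.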
Once the corresponding files are downloaded, you can re-render the corresponding ProciGen sequences using `python render/blender_rerender.py`. Example command to render 5 frames of the monitor sequence (shapes from ShapeNet):

```shell
python render/blender_rerender.py -s /path/to/procigen/Date04_Subxx_monitor_synzv2-01 -fe 5
```
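To re-render several sequences in one go, one option is a small Python driver that builds the command above for each sequence. The `-s` and `-fe` flags come from the example; the `rerender_cmd` helper and the sequence list are hypothetical.

```python
def rerender_cmd(seq_dir, n_frames=5):
    """Build the argv list for one blender_rerender.py call."""
    return ["python", "render/blender_rerender.py", "-s", seq_dir, "-fe", str(n_frames)]

# Placeholder sequence paths; pass each list to subprocess.run to execute.
seqs = ["/path/to/procigen/Date04_Subxx_monitor_synzv2-01"]
cmds = [rerender_cmd(s) for s in seqs]
```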
Data: download the quick start data. We also prepare assets for more objects; download them from here. Once downloaded, replace the demo assets with the full assets:

```shell
rm -r example/assets
unzip ProciGen-assets.zip -d example/assets
rm ProciGen-assets.zip
```
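A quick sanity check after unzipping: count the files under `example/assets` to confirm the full assets replaced the small demo set. The `count_files` helper is our addition, not part of the repo.

```python
import os

def count_files(root="example/assets"):
    """Count all files under root, recursing into subdirectories."""
    return sum(len(files) for _, _, files in os.walk(root))
```

The full asset set should yield a noticeably larger count than the demo assets did.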
With the full assets, you can synthesize interactions for other objects. Example commands for synthesis and rendering:
```shell
# Use chair from ShapeNet and interaction from BEHAVE chairblack sequences
python synz/synz_batch.py -src shapenet --object_category chair -obj chairblack -s "*chairblack*" -o <your params output>
python render/render_hoi.py -p <your params output> -src shapenet --obj_name chairblack -o <your render output>

# Use table from ABO and interaction from BEHAVE tablesmall sequences
python synz/synz_batch.py -src abo --object_category abo-table -obj tablesmall -s "*tablesmall*" -o <your params output>
python render/render_hoi.py -p <your params output> -src abo --obj_name tablesmall -o <your render output>

# Use box from Objaverse and interaction from BEHAVE boxlarge sequences
python synz/synz_batch.py -src objaverse --object_category box -obj boxlarge -s "*boxlarge*" -o <your params output>
python render/render_hoi.py -p <your params output> -src objaverse --obj_name boxlarge -o <your render output>
```
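The three synthesis/render pairs above follow one pattern, so batching them is straightforward. Below is a hypothetical driver that builds the command strings; the flag names (`-src`, `--object_category`, `-obj`, `-s`, `-o`, `-p`, `--obj_name`) come from the examples, while the job table and output layout are our assumptions.

```python
# (source dataset, object category in that dataset, BEHAVE object name)
JOBS = [
    ("shapenet", "chair", "chairblack"),
    ("abo", "abo-table", "tablesmall"),
    ("objaverse", "box", "boxlarge"),
]

def build_commands(out_root="outputs"):
    """Return the synthesis command followed by its render command for each job."""
    cmds = []
    for src, category, behave_obj in JOBS:
        params = f"{out_root}/{behave_obj}/params"
        render = f"{out_root}/{behave_obj}/render"
        cmds.append(
            f"python synz/synz_batch.py -src {src} --object_category {category} "
            f"-obj {behave_obj} -s '*{behave_obj}*' -o {params}"
        )
        cmds.append(
            f"python render/render_hoi.py -p {params} -src {src} "
            f"--obj_name {behave_obj} -o {render}"
        )
    return cmds
```

Each returned string can be run with `subprocess.run(cmd, shell=True)`, or written to a script for a job scheduler.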
Note that the examples above sample interactions from the BEHAVE 30fps data, which does not include the basketball and keyboard sequences. For these two objects, you can download the first version of BEHAVE from this url.
Training an AE for your own object shapes and processing the shapes: coming soon...