
Berkeley AUTOLAB's Dex-Net Package

Known problems

  1. Collision checking between the object and the gripper does not work if the mesh is not stored in the .dexnet file! By default, the code appears to access the mesh files through that database when loading them into OpenRAVE (see the inspection sketch below).
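A quick way to check whether an object's mesh made it into the database is to inspect the file directly. This is only a minimal sketch: it assumes the .dexnet database is an HDF5 file (as the Overview below describes for Dex-Net databases), that the h5ls tool from the HDF5 utilities is installed, and the file path is just an example.

```bash
# Sketch: list all groups in the database and look for mesh entries.
# The path is an example; point h5ls at your own .dexnet database.
h5ls -r /data/example_database.dexnet | grep -i mesh
```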

Installation and set-up

  1. Make sure you have a working installation of nvidia-docker on your computer.

  2. Clone the dex-net repository from GitHub to a directory of your choice: $WORKING_DIR/dex-net.

  3. Download data.zip from the SharePoint* and extract it. The extracted directory will be your $DATA_DIR. If there is not enough space on your hard drive, $DATA_DIR can also point to an external hard drive.

  4. Run ./dex-net/build_gpu_docker.sh from $WORKING_DIR.

  5. Set the variable PATH_DSET in $WORKING_DIR/dex-net/run_docker.sh to your $DATA_DIR.

  6. Run ./run_docker.sh from $WORKING_DIR/dex-net/. You should now be inside the Docker container, and your $DATA_DIR should be accessible as "/data" within it. The full command sequence is sketched below.
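For reference, steps 2-6 as one sequence might look like the following; the repository URL, $WORKING_DIR, and $DATA_DIR are placeholders for your own values.

```bash
# Sketch of the installation sequence (steps 2-6).
cd "$WORKING_DIR"
git clone <repository URL> dex-net          # step 2: clone into $WORKING_DIR/dex-net
# step 3: download data.zip from the SharePoint and extract it to $DATA_DIR
./dex-net/build_gpu_docker.sh               # step 4: build the GPU docker image
# step 5: edit PATH_DSET in dex-net/run_docker.sh so that it points to $DATA_DIR
cd dex-net
./run_docker.sh                             # step 6: inside the container, $DATA_DIR is mounted at /data
```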

Recreate the PerfectPredictions

In order to re-create the PerfectPrediction subset with the original pipeline (without the reprojection), run

python tools/render_dataset

This creates a folder /data/Recreated_grasps/ containing a DexNet dataset. To evaluate the performance, use the gqcnn repository.
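A minimal run from inside the Docker container might look like this; the output folder name is taken from the description above.

```bash
# Run the original rendering pipeline, then inspect the resulting DexNet dataset.
python tools/render_dataset
ls /data/Recreated_grasps/
```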

Reprojections of the Perfect Predictions from variable oblique camera viewpoints

In order to re-create the PerfectPrediction subset reprojected from an oblique camera viewpoint, run

python tools/open3d_reprojection.py --dir $DATUM_$ELEV --elev $ELEV

$ELEV specifies the elevation angle (the angle of the new camera viewing direction). Set visualise_mesh = True if you want to visualise the mesh after reconstruction. To evaluate the performance, use the gqcnn repository.
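As an example, reprojecting at an elevation of 30 degrees might look like the following; the --dir value is only an assumed placeholder for your own $DATUM_$ELEV directory.

```bash
# Sketch: reproject the PerfectPredictions with a 30 degree elevation angle.
# Substitute your own $DATUM_$ELEV directory for the example --dir value.
ELEV=30
python tools/open3d_reprojection.py --dir /data/reprojections_${ELEV} --elev ${ELEV}
```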


(*) If you want to generate more grasps than the 10 "PerfectPredictions", you need to download the full dexnet_2 meshes dataset into your $DATA_DIR. To compare against the original dataset, also download the dexnet_2 images dataset into your $DATA_DIR.



Links

Updated Project website

Documentation

Original Project website

RSS Paper

Updates

As of Jan 1, 2018 the AUTOLAB visualization module uses the trimesh library instead of meshpy. Version mismatches between cloned libraries may lead to exceptions when using the CLI. If you experience visualization errors, please run git pull origin master from the dex-net, meshpy, and visualization repositories and try again.
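For example, assuming the three repositories are cloned side by side under $WORKING_DIR (an assumption; adjust the paths to your own layout):

```bash
# Pull the latest master of each repository to resolve version mismatches.
for repo in dex-net meshpy visualization; do
  (cd "$WORKING_DIR/$repo" && git pull origin master)
done
```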

We are currently working on migrating dex-net to use trimesh and improving the installation procedure. We hope to release a new version by May 2018.

Overview

The dex-net Python package is for opening, reading, and writing HDF5 databases of 3D object models, parallel-jaw grasps, and grasp robustness metrics.

The HDF5 databases can also be used to generate massive datasets associating tuples of point clouds and grasps with binary grasp robustness labels to train Grasp Quality Convolutional Neural Networks (GQ-CNNs) to predict robustness of candidate grasps from point clouds. If you are interested in this functionality, please email Jeff Mahler ([email protected]) with the subject line: "Interested in GQ-CNN Dataset Generation."

This package is part of the Dexterity Network (Dex-Net) project. Created and maintained by the AUTOLAB at UC Berkeley.

Usage

As of Feb. 1, 2018, the code is licensed according to the UC Berkeley Copyright and Disclaimer Notice. The code is available for educational, research, and not-for-profit purposes (for full details, see LICENSE). If you use this code in a publication, please cite:

Mahler, Jeffrey, Jacky Liang, Sherdil Niyaz, Michael Laskey, Richard Doan, Xinyu Liu, Juan Aparicio Ojea, and Ken Goldberg. "Dex-Net 2.0: Deep Learning to Plan Robust Grasps with Synthetic Point Clouds and Analytic Grasp Metrics." Robotics: Science and Systems (2017). Boston, MA.

Datasets

The Dex-Net Object Mesh Dataset v1.1 and Dex-Net 2.0 HDF5 database can be downloaded from the data repository.

Custom datasets can now be generated using the script tools/generate_gqcnn_dataset.py.

Parallel-Jaw Grippers

The repository currently supports our custom ABB YuMi gripper. If you are interested in additional parallel-jaw grippers, please email Jeff Mahler ([email protected]) with the subject line: "Interested in Contributing to the Dex-Net Grippers" with a description of the parallel-jaw gripper you'd like to add.

Custom Database Generation

The master Dex-Net API does not support the creation of new databases of objects. If you are interested in using this functionality for research, see the custom-databases branch. However, we cannot provide support at this time.