# RoboDuet: Whole-body Legged Loco-Manipulation with Cross-Embodiment Deployment


This repo is the official PyTorch implementation of our paper "RoboDuet: Whole-body Legged Loco-Manipulation with Cross-Embodiment Deployment". Thanks to its cooperative policy mechanism and two-stage training strategy, the proposed framework demonstrates agile whole-body control and cross-embodiment deployment capabilities. 📺️ More demo details can be found on our project page.


## Installation

### Conda Environment

```bash
conda create -n roboduet python=3.8  # python=3.8 is necessary for Isaac Gym
conda activate roboduet
```

### Install Isaac Gym

1. Download Isaac Gym Preview 4 from https://developer.nvidia.com/isaac-gym.

2. Extract the package:

   ```bash
   tar -xf IsaacGym_Preview_4_Package.tar.gz
   ```

3. Install the Python package:

   ```bash
   cd isaacgym/python && pip install -e .
   ```

4. Verify the installation by trying to run an example (a minimal import check is also sketched after this list):

   ```bash
   python examples/1080_balls_of_solitude.py
   ```

5. For troubleshooting, check the docs at isaacgym/docs/index.html.
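If the graphical example cannot run (e.g. on a headless server), a quick import check is usually enough to confirm that the Python bindings are installed. This is only a minimal sketch, not part of the official Isaac Gym instructions:

```python
# Minimal sanity check that the isaacgym Python bindings import and the Gym API initializes.
# Note: isaacgym must be imported before torch in the same process.
from isaacgym import gymapi

gym = gymapi.acquire_gym()
print("Isaac Gym API acquired:", gym is not None)
```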


### Install this repo

```bash
git clone https://github.com/locomanip-duet/RoboDuet.git
cd RoboDuet
pip install -r requirements.txt
pip install -e .
```
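Training below assumes a CUDA device (`--sim_device cuda:0`). As a quick sanity check (assuming PyTorch was installed via requirements.txt), you can confirm that a GPU is visible before launching long runs:

```python
# Check that PyTorch sees a CUDA device before starting training.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```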

## Usage

### Train

```bash
python scripts/auto_train.py --num_envs 4096 --run_name test_roboduet --sim_device cuda:0 --robot go1  # or --robot go2
```

You can also pass `--headless` to run the simulation without a GUI:

```bash
python scripts/auto_train.py --num_envs 4096 --run_name test_roboduet --sim_device cuda:0 --robot go1 --headless
```
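Checkpoints are written under `runs/<run_name>/...`, and the play commands below point `--logdir` at one such directory. A small hypothetical helper, assuming that directory layout, can locate the most recent run:

```python
# Hypothetical helper: find the newest training run directory under runs/<run_name>/,
# matching the layout expected by --logdir in the play scripts.
from pathlib import Path

run_name = "test_roboduet"  # must match the --run_name used for training
candidates = sorted(
    Path("runs", run_name).glob("*/auto_train/*"),
    key=lambda p: p.stat().st_mtime,
)
print(candidates[-1] if candidates else "No runs found yet.")
```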

### Play

#### ⌨️ Keyboard Control

When using the keyboard to control the robot, the key mapping is as follows:

| Key | Action |
| --- | --- |
| NUMPAD 8 | Move Forward |
| NUMPAD 5 | Move Backward |
| NUMPAD 4 | Move Left |
| NUMPAD 6 | Move Right |
| NUMPAD 7 | Turn Left |
| NUMPAD 9 | Turn Right |
| U | Arm Up |
| O | Arm Down |
| I | Arm Forward |
| K | Arm Backward |
| J | Arm Left |
| L | Arm Right |
| W | Arm Pitch Down |
| S | Arm Pitch Up |
| A | Arm Roll Left |
| D | Arm Roll Right |
| Q | Arm Yaw Left |
| E | Arm Yaw Right |
| R | Reset |
```bash
# example:
python scripts/play_by_key.py --logdir runs/test_roboduet/2024-10-13/auto_train/003436.678552_seed9145 --ckptid 40000 --sim_device cuda:0
```

#### VR Control

1. Stream the Meta Quest 3 with ALVR and SteamVR. We mainly use the left controller; its key mapping is shown below.

   *(figure: left-handle key mapping)*

2. Copy scripts/vr_play/vr_streaming.py onto the PC used for streaming.

3. Modify the IP and port in vr_streaming.py and remote_pub.py (a sketch for finding the PC's LAN IP follows this list):

   ```python
   # NOTE: this is the IP and port of the PC host connected to the VR headset
   GLOBAL_IP = "192.168.12.198"
   GLOBAL_PORT = "34565"
   ```

4. Run:

   ```bash
   # PC
   python scripts/vr_play/vr_streaming.py
   ```

   ```bash
   # Training machine
   ## screen 1
   python scripts/vr_play/play_by_remote.py --logdir runs/test_roboduet/2024-10-13/auto_train/003436.678552_seed9145 --ckptid 40000 --sim_device cuda:0

   ## screen 2
   python scripts/vr_play/remote_pub.py
   ```
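GLOBAL_IP must be the address of the PC that runs vr_streaming.py. A minimal sketch (not part of the repo) for printing that PC's LAN IP, assuming it has a default outbound route:

```python
# Minimal sketch: print the LAN IP of the PC running vr_streaming.py,
# so it can be copied into GLOBAL_IP in vr_streaming.py and remote_pub.py.
# The UDP "connect" sends no packets; it only selects the outbound interface.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.connect(("8.8.8.8", 80))
    print(s.getsockname()[0])
```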

Enjoy your journey with the legged robot! 🎉️


## Deployment

We will provide deployment code for both the Unitree Go1 EDU and Unitree Go2 EDU robots mounted with the ARX5 arm. Additionally, we support using the Meta Quest 3 to control the end-effector pose of the ARX.

Please visit RoboDuet-Deployment for more details.


## Acknowledgement

The base implementation is largely borrowed from walk-these-ways, an impressive work that demonstrates robust locomotion with a multiplicity of behaviors (MoB). We are deeply grateful for their contribution to the open-source community.


## Citation

```bibtex
@misc{pan2024roboduetwholebodyleggedlocomanipulation,
      title={RoboDuet: Whole-body Legged Loco-Manipulation with Cross-Embodiment Deployment},
      author={Guoping Pan and Qingwei Ben and Zhecheng Yuan and Guangqi Jiang and Yandong Ji and Shoujie Li and Jiangmiao Pang and Houde Liu and Huazhe Xu},
      year={2024},
      eprint={2403.17367},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
}
```