AutoCast

AutoCast is an end-to-end autonomous driving system that enables scalable, infrastructure-less cooperative perception using direct vehicle-to-vehicle (V2V) communication. Within the limited V2V bandwidth, AutoCast mitigates safety hazards by coordinating autonomous vehicles to cast objects in occluded or invisible areas into their peer recipients' perspective. It carefully determines which objects to share based on the positional relationships between traffic participants and the time evolution of their trajectories, and it coordinates vehicles and optimally schedules transmissions in a scalable, distributed fashion.
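To make the object-selection idea concrete, the sketch below ranks detected objects by a simple relevance score and greedily fills a per-round bandwidth budget. It is an illustrative simplification, not AutoCast's actual scheduler, and every name in it (SharedObject, relevance, select_objects, the scoring formula) is hypothetical.

# Illustrative sketch only: not AutoCast's actual selection/scheduling logic.
# Rank candidate objects by a crude relevance score, then greedily pack them
# into a fixed per-round V2V bandwidth budget.
from dataclasses import dataclass
from typing import List

@dataclass
class SharedObject:
    size_bytes: int       # encoded point-cloud size of this object
    distance_m: float     # distance from the intended receiver
    occluded: bool        # True if the receiver cannot see it directly
    closing_speed: float  # m/s; positive when trajectories are converging

def relevance(obj: SharedObject) -> float:
    # Occluded objects on converging trajectories matter most to the receiver.
    score = max(obj.closing_speed, 0.0) / max(obj.distance_m, 1.0)
    return score * (2.0 if obj.occluded else 1.0)

def select_objects(objects: List[SharedObject], budget_bytes: int) -> List[SharedObject]:
    chosen, used = [], 0
    for obj in sorted(objects, key=relevance, reverse=True):
        if used + obj.size_bytes <= budget_bytes:
            chosen.append(obj)
            used += obj.size_bytes
    return chosen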

AutoCast: Scalable Infrastructure-less Cooperative Perception for Distributed Collaborative Driving
Hang Qiu, Po-Han Huang, Namo Asavisanu, Xiaochen Liu, Konstantinos Psounis, Ramesh Govindan
ACM Mobisys 2022

Paper | Website | Demo | Bibtex


Getting Started

Prerequisites

Ubuntu 20.04, CUDA 11.0, PyTorch == 1.7.1, PyTorch Geometric, Carla 0.9.11, Minkowski Engine (the CPU version is sufficient)

Installation

Clone the AutoCast repo with AutoCastSim submodule.

git clone --recursive git@github.com:hangqiu/AutoCast.git
git submodule update --remote

Installing dependencies and configuring paths

apt-get install mosquitto libopenblas-dev
apt remove python3-networkx
pip3 install -r requirements.txt
export SIM_ROOT=${PWD}/AutoCastSim
export SCENARIO_RUNNER_ROOT=${PWD}/AutoCastSim/srunner
export PYTHONPATH=${PYTHONPATH}:${SCENARIO_RUNNER_ROOT}:${SIM_ROOT}
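
As an optional sanity check (not part of the original setup steps), the short snippet below verifies that the key Python packages are importable and that the simulator paths are exported; run it from the repository root after the exports above.

# Optional sanity check: confirm key packages import and paths are exported.
import importlib.util
import os

for pkg in ("torch", "torch_geometric", "MinkowskiEngine"):
    status = "ok" if importlib.util.find_spec(pkg) else "MISSING"
    print(f"{pkg}: {status}")

import torch
print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())

for var in ("SIM_ROOT", "SCENARIO_RUNNER_ROOT"):
    print(f"{var} = {os.environ.get(var, 'UNSET')}")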

Running Demo Scenarios

Demo scenarios: Overtaking, Unprotected Left-turn, Red-light Violation

Run Carla

bash run_carla.sh [GPU_ID] [CARLA_PORT]

In a different terminal, run the test scenarios and follow the prompt to select a scenario ID: Overtake (6), Left-turn (8), Red-light (10), or Static (test).

bash run_test.sh [CARLA_PORT]
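
For example, bash run_carla.sh 0 2000 launches Carla on GPU 0 using Carla's default port 2000, and bash run_test.sh 2000 in a second terminal connects to it. The GPU index and port here are only example values; use whatever matches your machine, keeping the port consistent across both commands.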

Running with Specific Configuration

The specific commands for each scenario are stored in run_test.sh. For example, to run the red-light violation scenario:

python3 scenario_runner.py \
  --route srunner/data/routes_training_town03_autocast10.xml \
  srunner/data/towns03_traffic_scenarios_autocast10.json   \
  --reloadWorld \
  --bgtraffic 0 \
  --agent AVR/autocast_agents/simple_agent.py \
  [--hud --sharing]

The scenarios are configured using flags. For example, to enable sharing mode, add --sharing; to increase the traffic density (the simulation does get slower!), change --bgtraffic from 0 to, say, 30. A combined example is shown below.
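For instance, the red-light scenario above with sharing enabled and 30 background vehicles would be invoked roughly as follows (the flag values are illustrative):

python3 scenario_runner.py \
  --route srunner/data/routes_training_town03_autocast10.xml \
  srunner/data/towns03_traffic_scenarios_autocast10.json \
  --reloadWorld \
  --bgtraffic 30 \
  --agent AVR/autocast_agents/simple_agent.py \
  --sharing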

For more detailed scenario configuration, please see config.md and instructions here

Running with Real Radios

We provide scripts for running with dedicated V2V hardware (e.g., iSmartways V2V radios). If you are interested and have similar hardware, please contact us for a custom setup.

Training Your Cooperative Driving Agent

Please refer to Coopernaut as an example agent trained using AutoCast. We provide a training dataset for behavior cloning based on a simple rule-based agent. You can also collect your own data and train your own agents following these instructions.

Citation

@inproceedings{autocast,
  title={AutoCast: Scalable Infrastructure-less Cooperative Perception for Distributed Collaborative Driving},
  author={Hang Qiu and Pohan Huang and Namo Asavisanu and Xiaochen Liu and Konstantinos Psounis and Ramesh Govindan},
  booktitle = {Proceedings of the 20th Annual International Conference on Mobile Systems, Applications, and Services},
  series = {MobiSys '22},
  year={2022},
}
