Project FlyWave

The goal of the FlyWave team of UAVs@Berkeley is to develop a drone that maneuvers autonomously, using computer vision and deep learning to detect arm gestures.

Team

  • Alex Chan (Project Lead)
  • Timothy Liu
  • Raymond Gu
  • Nick Mecklenburg

Highlights

  • Person classification
  • Person movement tracking
  • Arm gesture detection
  • Using gestures to command the drone
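The gesture classes above can be recovered from 2D body keypoints (e.g. shoulder and wrist positions produced by a pose or object detector). A minimal sketch, assuming image coordinates where y increases downward and hypothetical keypoint tuples (the repo's actual detector and keypoint names may differ):

```python
def classify_arm_gesture(shoulder, wrist):
    """Classify one arm's pose from 2D keypoints (x, y) in image coordinates.

    Returns "up" if the wrist is clearly above the shoulder, "out" if the arm
    is extended roughly sideways, otherwise "neutral".
    """
    dx = wrist[0] - shoulder[0]
    dy = shoulder[1] - wrist[1]  # positive when the wrist is above the shoulder
    if dy > abs(dx):
        return "up"
    if abs(dx) > abs(dy):
        return "out"
    return "neutral"
```

For example, a wrist 60 px above the shoulder classifies as "up", while a wrist 80 px to the side classifies as "out".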

Requirements

  1. Parrot Bebop 2
  2. Python packages: TensorFlow, Keras, OpenCV, and pyparrot (dependencies described here).
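Before flying, it can help to confirm the Python dependencies are importable. A small sketch using the standard library; the module names listed are the usual import names for the packages above (e.g. OpenCV imports as `cv2`):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of module names that cannot be found by the importer."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Usual import names for the README's dependencies.
print(missing_packages(["tensorflow", "keras", "cv2", "pyparrot"]))
```

An empty list means everything is installed; otherwise the missing modules are listed.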

Setup

  1. Connect to the Bebop's WiFi.
  2. Before running any scripts, stand clear of the drone's takeoff area.
  3. In a terminal, run:
python drone_movement.py
  4. Stand in front of the Bebop and make gestures as desired:
  • Right arm out: move the Bebop in an arc to the right
  • Left arm out: move the Bebop in an arc to the left
  • Right arm up: land the drone
  • Left arm up: flip
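The gesture-to-action table above could be wired up roughly like this. This is a sketch, not the repo's actual `drone_movement.py`: `drone` stands for a connected drone object, and the action method names (`arc_right`, `arc_left`, `land`, `flip`) are placeholders for the real pyparrot calls:

```python
# Map each detected gesture to a drone action name (mirrors the table above).
GESTURE_ACTIONS = {
    "right_arm_out": "arc_right",
    "left_arm_out": "arc_left",
    "right_arm_up": "land",
    "left_arm_up": "flip",
}

def dispatch_gesture(gesture, drone):
    """Invoke the drone method mapped to the gesture; return the action name, or None if unrecognized."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        getattr(drone, action)()
    return action
```

Keeping the mapping in a dict makes it easy to add or rebind gestures without touching the detection loop.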

Credits

Credit to: PyParrot

Copyright for object detection: See LICENSE for details. Copyright (c) 2017 Dat Tran.
