Whoopnet

Whoopnet enables tiny FPV drones to operate autonomously by leveraging offboard neural networks. It handles raw visual and inertial data processing and provides the interface for delivering RC flight commands, supporting both classical and AI-driven control.

While compatible with any unmodified FPV multirotor that supports ELRS, HDZero, and Betaflight, the project targets sub-25-gram vehicles and advances their capabilities through offboard compute.

This allows a 20g robot to have a 20kg brain.

Read the docs.
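The RC command side of that interface ultimately comes down to CRSF frames on the ELRS link. As a rough, generic illustration (plain CRSF per the public spec, not whoopnet's actual API), a minimal Python sketch that packs 16 channels into an RC_CHANNELS_PACKED frame:

```python
# Generic CRSF RC_CHANNELS_PACKED (type 0x16) frame builder.
# Frame layout and CRC follow the public CRSF spec; the channel values
# and AETR ordering below are illustrative only.

CRSF_ADDR_FC = 0xC8           # destination: flight controller
CRSF_TYPE_RC_CHANNELS = 0x16  # 16 channels packed as 11 bits each

def crc8_dvb_s2(data: bytes) -> int:
    """CRC-8/DVB-S2 (poly 0xD5) over frame type + payload, per CRSF."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0xD5) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def pack_rc_channels(channels: list[int]) -> bytes:
    """Pack 16 channel values (11-bit, ~172..1811 CRSF ticks) into 22 bytes."""
    assert len(channels) == 16
    bits, bitlen, payload = 0, 0, bytearray()
    for ch in channels:
        bits |= (ch & 0x7FF) << bitlen
        bitlen += 11
        while bitlen >= 8:
            payload.append(bits & 0xFF)
            bits >>= 8
            bitlen -= 8
    return bytes(payload)

def build_rc_frame(channels: list[int]) -> bytes:
    body = bytes([CRSF_TYPE_RC_CHANNELS]) + pack_rc_channels(channels)
    return bytes([CRSF_ADDR_FC, len(body) + 1]) + body + bytes([crc8_dvb_s2(body)])

# Example: sticks centered (992 = CRSF mid), throttle low on channel 3 (AETR order).
frame = build_rc_frame([992, 992, 172] + [992] * 13)
```

A real setup streams these frames to the ELRS transmitter module's serial interface at a steady rate; that transport layer and whoopnet's own wrappers are omitted here.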


Goals

  • Stable State Estimation with VINS (Visual-Inertial Navigation)
  • Reproduce Deep Drone Acrobatics with a tiny whoop and offboard compute.

(Architecture diagrams: whoopnet_hw, io)

Challenges

  • Unsynchronized Camera and IMU feeds
  • Low Frequency IMU (~100 Hz)
  • Rolling Shutter Camera
  • ELRS Telemetry Bandwidth (see the back-of-envelope sketch below)
  • Noise
  • Murphy
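
The bandwidth item is easy to quantify with a back-of-envelope estimate. The per-sample layout below is an assumption (6 × int16 accel/gyro plus a uint32 FC timestamp), not whoopnet's actual packet format:

```python
# Rough telemetry budget for streaming raw IMU data at ~100 Hz.
# SAMPLE_BYTES is an assumed packing, not whoopnet's actual layout.
IMU_RATE_HZ = 100
SAMPLE_BYTES = 6 * 2 + 4      # 6 x int16 accel/gyro + uint32 FC timestamp = 16 B
CRSF_OVERHEAD = 4             # addr + length + type + crc per CRSF frame

payload_bps = IMU_RATE_HZ * SAMPLE_BYTES * 8                    # 12.8 kbit/s
framed_bps = IMU_RATE_HZ * (SAMPLE_BYTES + CRSF_OVERHEAD) * 8   # 16.0 kbit/s
print(f"payload {payload_bps/1e3:.1f} kbit/s, framed {framed_bps/1e3:.1f} kbit/s")
```

That sustained downlink load is well beyond what low ELRS packet rates leave for telemetry, which is consistent with the F1000 requirement in the strategy below.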

Strategy

  • Synchronize IMU and Camera via FC timestamp over OSD and telemetry (sketched after this list)
  • Betaflight Modifications:
    1. Disable existing telemetry; send raw IMU CRSF telemetry packets at ~100 Hz (with FC timestamp)
    2. Increase the OSD refresh rate to 30 Hz and send the FC timestamp (received and extracted via OCR)
  • ELRS v3 F1000Hz to support the telemetry bandwidth requirements
  • HDZero Video System (low and fixed latency)
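
The synchronization bullet above is essentially a software time-sync problem: both streams carry the same FC clock, so alignment reduces to bucketing IMU samples between consecutive camera-frame timestamps. A minimal sketch, with illustrative types rather than whoopnet's actual interfaces:

```python
# Align IMU telemetry to camera frames using the shared FC timestamp.
# Data shapes and names are illustrative; OCR of the OSD supplies the
# frame's fc_time_us, CRSF telemetry supplies each IMU sample's fc_time_us.
from bisect import bisect_right
from typing import NamedTuple

class ImuSample(NamedTuple):
    fc_time_us: int                      # FC timestamp from the telemetry packet
    accel: tuple[float, float, float]    # m/s^2 (illustrative units)
    gyro: tuple[float, float, float]     # rad/s (illustrative units)

def imu_between_frames(imu: list[ImuSample],
                       prev_frame_us: int,
                       frame_us: int) -> list[ImuSample]:
    """Return IMU samples with timestamps in (prev_frame_us, frame_us],
    i.e. the preintegration window a VINS front end would consume."""
    times = [s.fc_time_us for s in imu]  # assumed sorted by FC time
    return imu[bisect_right(times, prev_frame_us):bisect_right(times, frame_us)]
```

Because both timestamps come from the same FC clock, no cross-device clock offset or drift estimation is needed; the main remaining error sources are OSD/video latency and telemetry jitter.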
