Dean Machines is developing an open-source autonomous drone platform using a 5" FPV Racing drone equipped with advanced sensors and AI capabilities. Our mission is to create a comprehensive dataset for drone autonomy research.
- Drone Platform: 5" FPV Racing Drone
- AI Computer: NVIDIA Jetson Orin Nano
- Sensors:
  - TFmini-S LiDAR
  - AI-enabled Camera Module
  - IMU/Gyroscope
  - GPS Module
- Communication:
  - YHY 9800 Eng D SDR
  - 433MHz Receiver
  - FPV Video Receiver
  - 30ft Monopole Antenna
We're building a standardized dataset including:
- Visual Data (RGB + Depth)
- LiDAR Point Clouds
- Radio Telemetry
- Flight Controller Data
- Environmental Metrics
- NVIDIA Orin Nano for real-time processing
- Custom sensor fusion pipeline
- Edge AI inference
- Real-time telemetry streaming
- SDR signal processing
- Real-time data visualization
- Neural network training
- Dataset validation tools
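The document doesn't specify which fusion algorithm the pipeline uses. As an illustration only, a common starting point for fusing IMU data on an embedded board is a complementary filter, which blends fast-but-drifting gyroscope integration with a noisy-but-stable accelerometer angle. The names `fusePitch` and `accelPitch` below are hypothetical, not part of the project API:

```typescript
// Pitch angle (rad) estimated from the gravity direction in the
// accelerometer reading. Noisy short-term, but drift-free long-term.
function accelPitch(ax: number, ay: number, az: number): number {
  return Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
}

// One complementary-filter step: integrate the gyro rate, then nudge
// the estimate toward the accelerometer angle to cancel gyro drift.
function fusePitch(
  prevPitch: number,                // previous fused estimate (rad)
  gyroRate: number,                 // pitch rate from the gyro (rad/s)
  accel: [number, number, number],  // accelerometer reading (m/s²)
  dt: number,                       // timestep (s)
  alpha = 0.98                      // weight given to gyro integration
): number {
  const gyroEstimate = prevPitch + gyroRate * dt;
  const accelEstimate = accelPitch(...accel);
  return alpha * gyroEstimate + (1 - alpha) * accelEstimate;
}
```

With `alpha = 0.98` and a 200Hz IMU (the minimum rate required below, dt = 5ms), the accelerometer correction has an effective time constant of a few hundred milliseconds, which is a typical trade-off for small quadcopters.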
- Hardware Compatibility
- Sensor Calibration
- Data Format Standards
- Quality Metrics
- Validation Process
- 5" FPV Racing Drone Frame
- NVIDIA Jetson Orin Nano
- TFmini-S LiDAR Sensor
- HD FPV Camera (>720p)
- GPS Module (uBlox NEO-M8N or better)
- IMU (MPU6050 or better)
- SDR (YHY 9800 or compatible)
```typescript
// Shared 3-axis vector type used by the IMU fields below
interface Vec3 {
  x: number;
  y: number;
  z: number;
}

interface DroneDataPoint {
  timestamp: number; // Unix timestamp (ms)
  gps: {
    lat: number;      // Latitude
    lon: number;      // Longitude
    alt: number;      // Altitude (m)
    accuracy: number; // GPS accuracy (m)
  };
  imu: {
    acceleration: Vec3; // m/s²
    gyroscope: Vec3;    // rad/s
    magnetometer: Vec3; // μT
  };
  lidar: {
    distance: number; // Distance in meters
    strength: number; // Signal strength
  };
  camera: {
    resolution: string; // "1280x720"
    fps: number;        // Frames per second
    format: string;     // "h264"
  };
  radio: {
    frequency: number;      // MHz
    signalStrength: number; // dBm
    bandwidth: number;      // MHz
  };
}
```
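A sample record might look like the following. The values are illustrative, and the `Vec3` shape (`{x, y, z}`) plus the `isPlausible` helper are assumptions, not part of the project's code:

```typescript
type Vec3 = { x: number; y: number; z: number };

// Hypothetical single data point, matching the DroneDataPoint interface.
const sample = {
  timestamp: Date.now(),
  gps: { lat: 37.7749, lon: -122.4194, alt: 120.5, accuracy: 1.2 },
  imu: {
    acceleration: { x: 0.0, y: 0.0, z: 9.81 } as Vec3,
    gyroscope: { x: 0.01, y: -0.02, z: 0.0 } as Vec3,
    magnetometer: { x: 22.1, y: 5.3, z: -41.7 } as Vec3,
  },
  lidar: { distance: 3.42, strength: 412 },
  camera: { resolution: "1280x720", fps: 30, format: "h264" },
  radio: { frequency: 433.92, signalStrength: -67, bandwidth: 2 },
};

// Cheap sanity check before logging a point to the dataset.
function isPlausible(p: typeof sample): boolean {
  return (
    p.gps.lat >= -90 && p.gps.lat <= 90 &&
    p.gps.lon >= -180 && p.gps.lon <= 180 &&
    p.gps.accuracy >= 0 &&
    p.lidar.distance >= 0 &&
    p.camera.fps > 0
  );
}
```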
- Data Collection
  - Minimum 10 minutes of continuous flight
  - Various flight patterns required:
    - Hover
    - Forward flight
    - Figure-8
    - Obstacle navigation
- Data Validation

  ```bash
  # Validate dataset structure
  npm run validate-dataset path/to/data

  # Generate validation report
  npm run generate-report
  ```
- Submission Format

  ```
  dataset/
  ├── metadata.json      # Flight information
  ├── raw/               # Raw sensor data
  │   ├── camera/        # Video streams
  │   ├── lidar/         # LiDAR point clouds
  │   ├── imu/           # IMU readings
  │   └── radio/         # SDR captures
  └── processed/         # Processed data
      ├── trajectory/    # Flight path
      ├── obstacles/     # Detected obstacles
      └── annotations/   # Manual annotations
  ```
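Before submitting, it can help to verify the directory layout programmatically. This is an illustrative sketch (the `missingEntries` helper is not part of the project's tooling); it only checks that the paths in the tree above exist:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Entries every submission must contain, per the layout above.
const REQUIRED = [
  "metadata.json",
  "raw/camera", "raw/lidar", "raw/imu", "raw/radio",
  "processed/trajectory", "processed/obstacles", "processed/annotations",
];

// Return the required entries missing from a dataset directory.
function missingEntries(datasetDir: string): string[] {
  return REQUIRED.filter((rel) => !fs.existsSync(path.join(datasetDir, rel)));
}
```

An empty return value means the skeleton is in place; the content of each folder is still checked separately by `npm run validate-dataset`.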
- Camera: 720p minimum at 30fps
- LiDAR: 100Hz minimum sampling rate
- IMU: 200Hz minimum sampling rate
- GPS: 10Hz minimum update rate
- Radio: 433MHz band captures at 2MSPS
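The minimum rates above can be enforced mechanically. A small sketch, with an assumed `SensorRates` shape and `checkRates` name (not the project's actual validator):

```typescript
// Measured sampling rates for one capture session.
interface SensorRates {
  cameraFps: number; // frames per second
  lidarHz: number;   // LiDAR sampling rate
  imuHz: number;     // IMU sampling rate
  gpsHz: number;     // GPS update rate
  radioMsps: number; // SDR capture rate, megasamples/s
}

// Minimums taken from the data quality requirements above.
const MINIMUMS: SensorRates = {
  cameraFps: 30,
  lidarHz: 100,
  imuHz: 200,
  gpsHz: 10,
  radioMsps: 2,
};

// Return a human-readable message for each requirement the capture misses.
function checkRates(actual: SensorRates): string[] {
  return (Object.keys(MINIMUMS) as (keyof SensorRates)[])
    .filter((k) => actual[k] < MINIMUMS[k])
    .map((k) => `${k}: ${actual[k]} below required ${MINIMUMS[k]}`);
}
```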
- Automated validation check
- Data quality assessment
- Manual review by core team
- Integration into main dataset
View Full Documentation on our GitHub Wiki
This project includes the following components:

- `SensorDataDisplay`: Displays sensor data in a consistent format, and supports units, different display types, and text sizes.
- `DroneMap`: Displays a map with the drone's location, and supports displaying a path of previous locations and different map types.
- `TelemetryDisplay`: Displays telemetry data and supports a custom color for the value text.
- `LidarDataDisplay`: Displays lidar data as a 3D scatter plot with a dropdown menu to select different color scales.
- `IMUDataDisplay`: Displays IMU data as a 3D scatter plot with a dropdown menu to select different data types to display.
- `FPVVideoDisplay`: Displays a video stream from the FPV drone, with a loading state, error handling, and a responsive video element. The FPV page also includes an input box that lets the user enter a video URL and save it for later use, plus checkboxes for advanced options for receiving and displaying FPV video in real time.
```bash
# Website Development (Next.js)
npm run dev

# Python CV Pipeline (Coming Soon)
python3 setup.py develop
```
We welcome contributions to Dean Machines! Please follow these steps:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Read our Contributing Guidelines for detailed information.
Please read our Code of Conduct to keep our community approachable and respectable.
This project is licensed under the MIT License - see the LICENSE file for details.