# Weekly Report 02 (04.15. ~ 04.19.)
Mo edited this page Apr 19, 2019 · 1 revision
- Read through the Ni9elf-colab Documentation
  - Relevance for the project:
    - Real-time-capable object detection from RGB-D data, useful for object/floor type detection
    - Issues with small objects may not apply to a mobile robot identifying objects in an enclosed space
- Read through parts of *End-to-End Learning of Driving Models with Surround-View Cameras and Route Planners*
  - Relevance for the project:
    - Datasets are probably irrelevant
    - Driving-model learning is potentially useful
    - Cited sources may contain useful information
- Object/Floor Detection and Path Planning for Mobile Robots
  - Floor type & (Canny) edge detection for environment mapping
  - Floor type & object detection to find target waypoints for the mobile robot
  - Mobile robot driving model
    - Task planning and prioritisation module
    - Route planning module that sets waypoints in the virtual environment map and adjusts the route around obstacles
    - Low-level navigation routine that drives to the next/highest-priority waypoint
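The edge-detection step for environment mapping could be prototyped roughly as below. This is a minimal NumPy sketch covering only the Sobel gradient-magnitude stage of Canny (full Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding); the function name `sobel_edges` and the threshold value are illustrative assumptions, not project code.

```python
import numpy as np

def sobel_edges(gray, threshold=1.0):
    """Binary edge map of a 2-D grayscale image.

    Simplified stand-in for Canny: computes Sobel gradients and
    thresholds the gradient magnitude, skipping Canny's smoothing,
    non-maximum suppression, and hysteresis steps.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Valid-region convolution; the 1-pixel border stays zero.
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = gray[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = np.sum(kx * patch)
            gy[y, x] = np.sum(ky * patch)
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

# A vertical brightness step should yield a vertical edge line.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
edges = sobel_edges(img, threshold=1.0)
```

In practice `cv2.Canny` would replace this loop-based sketch; the point here is only the pipeline position of the edge map (floor-boundary candidates feeding the environment map).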
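The task-prioritisation and waypoint-selection modules above could be sketched as a simple priority queue. Everything here (`WaypointPlanner`, the lower-number-means-higher-priority convention, the example coordinates) is a hypothetical illustration assuming waypoints carry a scalar priority; it is not the project's actual interface.

```python
import heapq

class WaypointPlanner:
    """Toy prioritisation queue: waypoints are (priority, position)
    pairs and the robot always drives to the highest-priority one next.
    Lower numbers mean higher priority; ties break by insertion order."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # insertion order for stable tie-breaking

    def add_waypoint(self, priority, position):
        heapq.heappush(self._heap, (priority, self._counter, position))
        self._counter += 1

    def next_waypoint(self):
        """Pop and return the position of the highest-priority waypoint,
        or None when no waypoints remain."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

planner = WaypointPlanner()
planner.add_waypoint(2, (4, 0))  # e.g. inspect an object
planner.add_waypoint(1, (1, 3))  # e.g. urgent obstacle re-scan
order = [planner.next_waypoint(), planner.next_waypoint()]
```

The route-planning module would sit between this queue and the low-level navigation routine, replanning the path to the popped waypoint whenever an obstacle appears on the current route.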
- Read remaining sources
- Create first draft of mission statement