Open access
Date
2020
Type
Conference Paper
Abstract
In this work, we present a perception-aware path-planning pipeline for Unmanned Aerial Vehicles (UAVs) navigating challenging environments. The objective is to reach a given destination safely and accurately while relying on monocular camera-based state estimators, such as keyframe-based Visual-Inertial Odometry (VIO) systems. Motivated by recent advances in semantic segmentation using deep learning, our path-planning architecture takes into account the semantic classes of the parts of the scene that are perceptually more informative than others. This work proposes a planning strategy capable of avoiding both texture-less regions and problematic areas, such as lakes and oceans, that may cause large drift or failures in the robot's pose estimation, by using the semantic information to compute the next best action with respect to perception quality. We design a hierarchical planner, composed of an A* path-search step followed by B-Spline trajectory optimization. While the A* search steers the UAV towards informative areas, the optimizer keeps the most promising landmarks in the camera's field of view. We extensively evaluate our approach in a set of photo-realistic simulations, showing a remarkable improvement over the state-of-the-art in active perception.
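The perception-aware A* step described in the abstract can be illustrated with a minimal sketch: a standard grid A* whose step cost is augmented by a per-cell semantic penalty (high for texture-less or water-like regions), so the search is steered towards perceptually informative areas. This is an assumption-laden toy illustration of the idea, not the paper's actual planner; the function name, grid representation, and penalty values are all hypothetical.

```python
import heapq

def perception_aware_astar(grid_cost, start, goal):
    """A* on a 2D grid where grid_cost[r][c] is a semantic penalty added
    to the unit traversal cost (hypothetical stand-in for the paper's
    perception-quality term). Returns a list of (row, col) cells."""
    rows, cols = len(grid_cost), len(grid_cost[0])

    def h(a):  # Manhattan distance: admissible for 4-connected unit moves
        return abs(a[0] - goal[0]) + abs(a[1] - goal[1])

    open_set = [(h(start), start)]
    g = {start: 0.0}       # best known cost-to-come per cell
    came_from = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]   # reconstruct path by walking parents backwards
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (cur[0] + dr, cur[1] + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols:
                # step cost = 1 + semantic penalty of the cell entered
                cand = g[cur] + 1.0 + grid_cost[nb[0]][nb[1]]
                if cand < g.get(nb, float("inf")):
                    g[nb] = cand
                    came_from[nb] = cur
                    heapq.heappush(open_set, (cand + h(nb), nb))
    return None  # goal unreachable
```

For example, on a 5x5 grid where column 2 carries a heavy penalty (a "lake") except in the bottom row, the returned path detours through the bottom row rather than crossing the penalized cells, mirroring how the A* stage avoids perceptually poor regions.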
Permanent link
https://doi.org/10.3929/ethz-b-000441269
Publication status
published
Book title
2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Publisher
IEEE
Subject
ROBOTICS; Path Planning; Visual Inertial Odometry
Organisational unit
09559 - Chli, Margarita (ehemalig) / Chli, Margarita (former)
Related publications and datasets
Is part of: https://doi.org/10.3929/ethz-b-000618140
Notes
Conference lecture held on October 27, 2020. Due to the Coronavirus (COVID-19) pandemic, the conference was conducted virtually.