Show simple item record

dc.contributor.author
Bartolomei, Luca
dc.contributor.supervisor
Chli, Margarita
dc.contributor.supervisor
Alexis, Konstantinos
dc.contributor.supervisor
Scherer, Sebastian
dc.date.accessioned
2023-06-26T11:18:53Z
dc.date.available
2023-06-24T13:13:24Z
dc.date.available
2023-06-26T11:18:53Z
dc.date.issued
2023
dc.identifier.uri
http://hdl.handle.net/20.500.11850/618140
dc.identifier.doi
10.3929/ethz-b-000618140
dc.description.abstract
Unmanned Aerial Vehicles (UAVs), more commonly known as drones, have the potential to profoundly impact numerous applications, such as inspection, post-disaster assessment and Search and Rescue (SaR), thanks to their agility, which translates into a unique ability to move freely in 3D space, and to their continuously decreasing cost. However, despite recent technological advances, commercially available drones are either flown manually by an experienced pilot or operated in a semi-autonomous mode that relies largely on GPS. In an effort to automate the aforementioned tasks using small rotorcraft UAVs, the research community has been focusing on improving their capacity to navigate unknown environments autonomously, relying on onboard sensing for pose estimation, mapping and path planning. More advanced extensions deploying multiple robots have also been proposed, aiming to boost the effectiveness of robotic missions further, as efficiency is particularly crucial in time-critical applications such as rescue operations. However, while it can accelerate missions dramatically, the deployment of multiple UAVs entails a series of difficulties in terms of co-localization and coordination of the swarm. Inspired by these challenges, this thesis begins by addressing the problem of multi-robot collaboration for inspection and exploration tasks, while the second part concentrates on the problem of perception-aware navigation, also known in the literature as active planning. Aiming to relax the assumptions typical in multi-robot planning, such as the availability of the agents' ground-truth poses and of a prior map of the environment, the first approach proposes a centralized multi-robot architecture encompassing state estimation, dense mapping and multi-agent coordination. The objective is the generation of a consistent 3D map of a large-scale structure of interest using a team of drones flying a pre-defined coarse mission plan. However, although it constitutes a complete, applicable solution, this approach requires initial guidance from a human operator. Addressing this limitation, a follow-up approach proposes a decentralized coordination strategy for the automatic exploration of forests. As this type of scenery is common in SaR, minimizing the time required to cover the area of interest is vital; therefore, we propose an efficient strategy that exploits the agility of UAVs, maintaining consistently high flight velocities throughout the mission despite a potentially high density of obstacles and a large number of occlusions in the environment. Although exhibiting promising performance, these two approaches rely on the assumption that the UAVs' poses can be estimated directly and accurately by sensors such as GPS, which can fail in a number of situations, for example, when flying close to structures or buildings. A well-established alternative to GPS-based localization in the UAV navigation literature is Visual-Inertial Simultaneous Localization And Mapping (VI-SLAM), where the robot's poses are estimated using sequences of images and high-rate inertial measurements. On the other hand, VI-SLAM is sensitive to the motions of the UAV, and the performance of camera-based state estimation is strongly tied to the visual appearance of the environment. Driven by these limitations, the second part of this thesis focuses on the problem of perception-aware path planning, with the objective of minimizing the error in pose estimation by generating camera-aware motions.
Inspired by the maturity of semantic segmentation in the Computer Vision literature, which partitions the scene into semantically meaningful clusters of pixels, this thesis addresses the problem of active planning using semantic cues. In contrast to the large body of research carried out in the autonomous driving literature, to the best of our knowledge, this was the first time a semantic-aware active perception approach for UAVs was presented. Guided by semantic scene annotation, the first proposed approach for active planning encourages the robot to navigate over visually reliable regions, e.g. solid scene structures such as buildings, while avoiding perceptually degraded areas characterized, for example, by highly dynamic content or reflective surfaces, such as water basins. This concept of using semantics for reliable vision-based navigation is pushed forward in a follow-up approach, where a deep Reinforcement Learning (RL) policy is trained to identify areas useful for VI-SLAM online, during mission execution. This allows the UAV to adapt its future trajectory dynamically to the scene in view, improving performance in terms of robustness and localization error. Nevertheless, during deployment in real missions, small UAVs are susceptible to a series of possible threats, such as strong wind gusts or sensor faults, which can potentially lead to crashes. Thus, in these situations it is fundamental to enable them to find a suitable spot where they can land autonomously. In an effort to ensure the safety of the robot and of its surroundings, especially in urban areas, this thesis closes with a semantic-aware approach for the autonomous emergency landing of multi-rotor UAVs. Here, following a deep-RL paradigm, we demonstrate that semantic information allows a landing spot to be found faster by exploiting the high-level spatial associations between semantic classes (e.g. cars and roads). The proposed pipeline can be deployed directly in real-world experiments immediately after policy training, without additional fine-tuning or domain adaptation. With the focus on multi-robot coordination and perception-aware active planning for UAVs, the approaches and systems presented in this thesis contribute towards autonomous aerial navigation deployable in challenging real-world scenarios. Furthermore, it is demonstrated that the use of semantic segmentation can be extremely beneficial for path planning during vision-based flights and autonomous emergency landing. This leads to more robust methods, able to succeed where state-of-the-art systems fail, paving the way towards more reliable autonomous navigation of robotic agents.
en_US
dc.format
application/pdf
en_US
dc.language.iso
en
en_US
dc.publisher
ETH Zurich
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
Robotics (cs.RO)
en_US
dc.subject
Path Planning
en_US
dc.subject
Multi-agent systems
en_US
dc.title
Towards Robust Active Planning for Autonomous Aerial Navigation
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
dc.date.published
2023-06-26
ethz.size
141 p.
en_US
ethz.code.ddc
DDC - DDC::0 - Computer science, information & general works::004 - Data processing, computer science
en_US
ethz.identifier.diss
29282
en_US
ethz.publication.place
Zurich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::09559 - Chli, Margarita (ehemalig) / Chli, Margarita (former)
en_US
ethz.relation.hasPart
10.3929/ethz-b-000441280
ethz.relation.hasPart
10.48550/arXiv.2301.08537
ethz.relation.hasPart
10.3929/ethz-b-000441269
ethz.relation.hasPart
10.3929/ethz-b-000500873
ethz.relation.hasPart
10.3929/ethz-b-000560674
ethz.date.deposited
2023-06-24T13:13:25Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2023-06-26T11:19:35Z
ethz.rosetta.lastUpdated
2023-06-26T11:19:35Z
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Towards%20Robust%20Active%20Planning%20for%20Autonomous%20Aerial%20Navigation&rft.date=2023&rft.au=Bartolomei,%20Luca&rft.genre=unknown&rft.btitle=Towards%20Robust%20Active%20Planning%20for%20Autonomous%20Aerial%20Navigation