Open access
Date
2019-04
Type
Journal Article
Abstract
Place recognition is an essential capability for
robotic autonomy. While ground robots observe the world from
generally similar viewpoints over repeated visits, other robots,
such as small aircraft, experience far more diverse viewpoints,
requiring place recognition for images captured from very
wide baselines. While traditional feature-based methods fail
dramatically under extreme viewpoint changes, deep learning
approaches demand heavy runtime processing. Driven by the
need for cheaper alternatives able to run on computationally
restricted platforms, such as small aircraft, this work proposes
a novel real-time pipeline employing depth-completion on sparse
feature maps that are anyway computed during robot localization
and mapping, to enable place recognition at extreme viewpoint
changes. The proposed approach demonstrates unprecedented
precision-recall rates on challenging benchmark datasets as well as
our own synthetic and real datasets with up to 45° difference
in viewpoint.
In particular, our synthetic datasets are, to the best of
our knowledge, the first to isolate the challenge of viewpoint
changes for place recognition, addressing a crucial gap in the
literature. All of the new datasets are publicly available to aid
benchmarking.
Persistent link
https://doi.org/10.3929/ethz-b-000321767
Publication status
published
Journal / series
IEEE Robotics and Automation Letters
Publisher
IEEE
Subject
Aerial Systems: Perception and Autonomy; Visual-Based Navigation; SLAM; Localization; Place Recognition
Organisational unit
09559 - Chli, Margarita (former)
Funding
157585 - Collaborative vision-based perception for teams of (aerial) robots (SNF)