Open access
Date
2018-10
Type
Journal Article
Abstract
The recent emergence of bioinspired event cameras has opened up exciting new possibilities in high-frequency tracking, bringing robustness to common problems in traditional vision, such as lighting changes and motion blur. In order to leverage these attractive attributes of event cameras, research has focused on understanding how to process their unusual output: an asynchronous stream of events. With the majority of existing techniques discretizing the event stream, essentially forming frames of events grouped by their timestamps, the power of these cameras has yet to be fully exploited. In this spirit, this letter proposes a new, purely event-based corner detector and a novel corner tracker, demonstrating that it is possible to detect and track corners directly on the event stream in real time. Evaluation on benchmarking datasets reveals that the proposed approach yields a significant boost in the number of detected corners and in the repeatability of such detections over the state of the art, even in challenging scenarios, while running more than 4× faster than the most efficient algorithm in the literature. The proposed pipeline detects and tracks corners at a rate of more than 7.5 million events per second, promising great impact in high-speed applications.
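The abstract contrasts the common practice of discretizing the event stream into timestamp-grouped frames with purely event-by-event processing. The following Python sketch is illustrative only and is not taken from the paper: the Event tuple, the 10 ms window, and the stand-in callback are assumptions used purely to make the distinction between the two processing styles concrete.

```python
# Illustrative only: not the paper's algorithm. Contrasts grouping an
# asynchronous event stream into timestamp-based "event frames" with
# handing each event individually to a per-event callback.
from collections import namedtuple

# A DVS-style event: pixel coordinates, timestamp (seconds), polarity (+1/-1).
Event = namedtuple("Event", ["x", "y", "t", "polarity"])

def frames_by_timestamp(events, window=0.010):
    """Discretize the stream into frames of events sharing a time window."""
    frames, current, frame_start = [], [], None
    for e in events:
        if frame_start is None:
            frame_start = e.t
        if e.t - frame_start >= window:
            frames.append(current)
            current, frame_start = [], e.t
        current.append(e)
    if current:
        frames.append(current)
    return frames

def process_event_by_event(events, on_event):
    """Purely event-driven processing: invoke a handler on every single event."""
    for e in events:
        on_event(e)

# Toy stream: 50 events, one per millisecond, alternating polarity.
stream = [Event(10, 20, 0.001 * i, 1 if i % 2 else -1) for i in range(50)]
print(len(frames_by_timestamp(stream)))          # number of 10 ms frames
process_event_by_event(stream, lambda e: None)   # per-event path (no-op handler)
```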
Persistent link
https://doi.org/10.3929/ethz-b-000277131
Publication status
published
External links
Journal / Series
IEEE Robotics and Automation Letters
Volume
Pages / Article number
Publisher
IEEE
Subject
Visual tracking; Computer Vision; Robotics; SLAM
Organisational unit
09559 - Chli, Margarita (former)
Funding
644128 - Collaborative Aerial Robotic Workers (SBFI)
157585 - Collaborative vision-based perception for teams of (aerial) robots (SNF)
Related publications and data
Is cited by: https://doi.org/10.3929/ethz-b-000360434