Open access
Date
2018-10
Type
Journal Article
Abstract
The recent emergence of bioinspired event cameras has opened up exciting new possibilities in high-frequency tracking, bringing robustness to common problems in traditional vision, such as lighting changes and motion blur. To leverage these attractive attributes, research has focused on understanding how to process the cameras' unusual output: an asynchronous stream of events. As the majority of existing techniques discretize the event stream, essentially forming frames of events grouped by timestamp, the power of these cameras remains largely unexploited. In this spirit, this letter proposes a new, purely event-based corner detector and a novel corner tracker, demonstrating that it is possible to detect and track corners directly on the event stream in real time. Evaluation on benchmark datasets reveals that the proposed approach significantly boosts both the number of detected corners and the repeatability of such detections over the state of the art, even in challenging scenarios, while enabling a more than 4× speed-up compared to the most efficient algorithm in the literature. The proposed pipeline detects and tracks corners at a rate of more than 7.5 million events per second, promising great impact in high-speed applications.
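To illustrate the kind of per-event processing the abstract describes, the sketch below shows a minimal asynchronous pipeline over an event stream. It uses a Surface of Active Events (a per-pixel map of latest event timestamps, a structure common in event-based corner detection) and a placeholder corner heuristic; the `Event` fields, `looks_like_corner` test, and all thresholds are illustrative assumptions, not the paper's actual detector.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds
    polarity: bool  # brightness increase (True) or decrease (False)

def process_events(events, width, height):
    """Process an asynchronous event stream one event at a time.

    Maintains a Surface of Active Events (SAE): the timestamp of the
    most recent event at each pixel, kept separately per polarity.
    """
    sae = np.full((2, height, width), -np.inf)
    corners = []
    for e in events:
        sae[int(e.polarity), e.y, e.x] = e.t  # update SAE for this event
        if looks_like_corner(sae[int(e.polarity)], e):
            corners.append(e)
    return corners

def looks_like_corner(surface, e, radius=3, thresh=0.5):
    """Placeholder heuristic, NOT the detector from the paper:
    flag the event if many neighbouring pixels fired very recently."""
    y0, y1 = max(e.y - radius, 0), min(e.y + radius + 1, surface.shape[0])
    x0, x1 = max(e.x - radius, 0), min(e.x + radius + 1, surface.shape[1])
    patch = surface[y0:y1, x0:x1]
    recent = patch > (e.t - 0.01)  # neighbours active in the last 10 ms
    return recent.mean() > thresh
```

The key point the example mirrors is that each incoming event triggers a constant-size local update and test, so no frames are ever formed and latency stays at the level of a single event.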
Permanent link
https://doi.org/10.3929/ethz-b-000277131
Publication status
published
External links
Journal / series
IEEE Robotics and Automation Letters
Volume
Pages / Article No.
Publisher
IEEE
Subject
Visual tracking; Computer Vision; Robotics; SLAM
Organisational unit
09559 - Chli, Margarita (former)
Funding
644128 - Collaborative Aerial Robotic Workers (SBFI)
157585 - Collaborative vision-based perception for teams of (aerial) robots (SNF)
Related publications and datasets
Is cited by: https://doi.org/10.3929/ethz-b-000360434