
dc.contributor.author
Alzugaray, Ignacio
dc.contributor.supervisor
Chli, Margarita
dc.contributor.supervisor
Davison, Andrew
dc.contributor.supervisor
Scaramuzza, Davide
dc.contributor.supervisor
Kneip, Laurent
dc.date.accessioned
2022-04-08T11:37:59Z
dc.date.available
2022-04-08T10:12:34Z
dc.date.available
2022-04-08T10:46:34Z
dc.date.available
2022-04-08T11:15:53Z
dc.date.available
2022-04-08T11:37:59Z
dc.date.issued
2022
dc.identifier.uri
http://hdl.handle.net/20.500.11850/541700
dc.identifier.doi
10.3929/ethz-b-000541700
dc.description.abstract
Traditional frame-based cameras have become the de facto sensor of choice for a multitude of applications employing Computer Vision due to their compactness, low cost, ubiquity, and ability to provide information-rich exteroceptive measurements. Despite their dominance in the field, these sensors exhibit limitations in common, real-world scenarios where detrimental effects, such as motion blur during high-speed motion or over-/underexposure in scenes with poor illumination, are prevalent. Challenging the dominance of traditional cameras, the recent emergence of bioinspired event cameras has opened up exciting research possibilities for robust perception due to their high-speed sensing, High-Dynamic-Range capabilities, and low power consumption. Despite their promising characteristics, event cameras present numerous challenges due to their unique output: a sparse and asynchronous stream of events, only capturing incremental perceptual changes at individual pixels. This radically different sensing modality renders most traditional Computer Vision algorithms incompatible without substantial prior adaptation, as they were initially devised for processing sequences of images captured at a fixed frame rate. Consequently, the bulk of existing event-based algorithms in the literature have opted to discretize the event stream into batches and process them sequentially, effectively reverting to frame-like representations in an attempt to mimic the processing of image sequences from traditional sensors. Such event-batching algorithms have demonstrably outperformed frame-based alternatives in scenarios where the quality of conventional intensity images is severely compromised, unveiling the inherent potential of these new sensors and popularizing them. To date, however, many newly designed event-based algorithms still rely on a contrived discretization of the event stream for its processing, suggesting that the full potential of event cameras is yet to be harnessed by processing their output more naturally. This dissertation departs from the mere adaptation of traditional frame-based approaches and advocates instead for the development of new algorithms integrally designed for event cameras to fully exploit their advantageous characteristics. In particular, the focus of this thesis lies on describing a series of novel strategies and algorithms that operate in a purely event-driven fashion, i.e., processing each event as soon as it is generated, without any intermediate buffering of events into arbitrary batches, thus avoiding any additional latency in their processing. Such event-driven processes present additional challenges compared to their simpler event-batching counterparts, largely attributable to the requirement to produce reliable results at event rate, which entails significant practical implications for their deployment in real-world applications. The body of this thesis addresses the design of event-driven algorithms for efficient and asynchronous feature detection and tracking with event cameras, covering along the way crucial elements of pattern recognition and data association for this emerging sensing modality. In particular, a significant portion of this thesis is devoted to the study of visual corners for event cameras, leading to the design of innovative event-driven approaches for their detection and tracking as corner-events.
Moreover, the presented research also investigates the use of generic patch-based features and their event-driven tracking for the efficient retrieval of high-quality feature tracks. All the algorithms developed in this thesis serve as crucial stepping stones towards a completely event-driven, feature-based Simultaneous Localization And Mapping (SLAM) pipeline. This dissertation builds upon established concepts from state-of-the-art event-driven methods and further explores the limits of the event-driven paradigm in realistic monocular setups. While the presented approaches rely solely on event data, the insights gained lay the groundwork for future investigations targeting the combination of event-based vision with other, complementary sensing modalities. The research conducted here paves the way towards a new family of event-driven algorithms that operate efficiently, robustly, and in a scalable manner, envisioning a potential paradigm shift in event-based Computer Vision.
en_US
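
The abstract's central distinction, between event-batching and purely event-driven processing, can be made concrete with a short sketch. The following Python fragment is purely illustrative and not taken from the thesis; the Event fields and the names process_batched, process_event_driven, handle_batch, and handle_event are assumptions chosen for exposition.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Event:
    """A single event-camera measurement: pixel location, timestamp, polarity."""
    x: int
    y: int
    t: float        # timestamp in seconds
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def process_batched(stream: Iterable[Event],
                    handle_batch: Callable[[List[Event]], None],
                    batch_size: int = 1000) -> None:
    """Event-batching: buffer events into fixed-size groups (a frame-like
    representation) and process each group at once; output is delayed
    until a batch fills up."""
    batch: List[Event] = []
    for ev in stream:
        batch.append(ev)
        if len(batch) == batch_size:
            handle_batch(batch)
            batch = []
    if batch:  # flush the final, partially filled batch
        handle_batch(batch)

def process_event_driven(stream: Iterable[Event],
                         handle_event: Callable[[Event], None]) -> None:
    """Event-driven: every event updates the algorithm's state the moment
    it arrives, with no intermediate buffering and no added latency."""
    for ev in stream:
        handle_event(ev)
```

In the batched variant, the handler only fires once enough events have accumulated, introducing a latency of up to batch_size events; in the event-driven variant, state is updated per event, which is precisely the property the abstract identifies as both the appeal and the challenge of the paradigm.
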
dc.format
application/pdf
en_US
dc.language.iso
en
en_US
dc.publisher
ETH Zurich
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
Computer Vision
en_US
dc.subject
SLAM
en_US
dc.subject
Event camera
en_US
dc.subject
Robotics
en_US
dc.subject
Asynchronous processing
en_US
dc.subject
Feature detection
en_US
dc.subject
Feature tracking
en_US
dc.title
Event-driven Feature Detection and Tracking for Visual SLAM
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
dc.date.published
2022-04-08
ethz.size
116 p.
en_US
ethz.code.ddc
DDC - DDC::0 - Computer science, information & general works::004 - Data processing, computer science
en_US
ethz.identifier.diss
28264
en_US
ethz.publication.place
Zurich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::09559 - Chli, Margarita (ehemalig) / Chli, Margarita (former)
en_US
ethz.date.deposited
2022-04-08T10:12:40Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2022-04-08T10:46:45Z
ethz.rosetta.lastUpdated
2023-02-07T00:46:06Z
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Event-driven%20Feature%20Detection%20and%20Tracking%20for%20Visual%20SLAM&rft.date=2022&rft.au=Alzugaray,%20Ignacio&rft.genre=unknown&rft.btitle=Event-driven%20Feature%20Detection%20and%20Tracking%20for%20Visual%20SLAM