Show simple item record

dc.contributor.author
Neil, Daniel
dc.contributor.supervisor
Liu, Shih-Chii
dc.contributor.supervisor
Delbruck, Tobi
dc.contributor.supervisor
Lee, Daniel
dc.contributor.supervisor
Martin, Kevan A.C.
dc.date.accessioned
2017-07-06T12:28:11Z
dc.date.available
2017-07-06T11:42:20Z
dc.date.available
2017-07-06T12:28:11Z
dc.date.issued
2017
dc.identifier.uri
http://hdl.handle.net/20.500.11850/168865
dc.identifier.doi
10.3929/ethz-b-000168865
dc.description.abstract
Event-based sensors, built with biological inspiration, differ greatly from traditional sensor types. A standard vision sensor uses a pixel array to produce a frame containing the light intensity at every pixel whenever the sensor is sampled; a standard audio sensor produces a waveform of sound amplitude over time. Event-based sensors, by contrast, are typically far sparser in their output: they are data-driven rather than sampled, producing events only upon informative changes in the scene, usually with low latency and accurate timing. Because these outputs differ so radically from those of traditional sensors, standard data analysis techniques are hard to apply to event-based data, despite the advanced state of computational techniques for image understanding and acoustic processing. Machine learning in particular, and deep learning especially, has made great strides towards scene understanding in recent years. The goal of this thesis is to study how these novel sensors can draw on the state of the art in machine learning while maintaining the advantages of event-based sensing.
This thesis takes the view that traditional, frame-based data has limited the scope of discovery for new kinds of machine learning algorithms. While machine learning algorithms have achieved great success, their achievements pale in comparison to biological reasoning, and perhaps this gap arises from fundamental assumptions about what is processed in addition to how. By relaxing expectations about the kinds of data to be processed, even better algorithms may be discovered: algorithms that not only work with biologically-inspired event-based sensors but also outperform traditional machine learning algorithms.
The problem is studied at multiple levels of abstraction. Chapter 2 introduces custom hardware platforms that prototype an existing machine learning algorithm in hardware, aiming to preserve the advantages of both state-of-the-art machine learning and the novel sensor types at the most fundamental hardware level, and to better understand the algorithms' limitations. This work revealed that the most significant bottleneck in combining the two is the loss of accuracy relative to traditional machine learning algorithms; it motivates the work in Chapter 3, which dramatically increases the accuracy of event-driven neural networks on fixed, unchanging scenes (e.g., image analysis, perhaps the most well-studied problem in deep learning today). With that primary limitation addressed, Chapter 4 explores advantages that are available to event-driven deep networks but not to traditional deep learning. Chapter 5 forms perhaps the key contribution of this thesis: it introduces a novel algorithm, Phased LSTM, that natively works with event-driven sensors observing dynamic, changing scenes. As hypothesized above, Phased LSTM offers significant advantages over traditional deep neural networks, both for event-driven inputs and for standard frame-based inputs. Chapter 6 investigates the source of these advantages, examining whether the model is as simple and as advantageous as it appears.
Finally, an observation made during the development of Phased LSTM motivates applying a principle of event-based sensing to computation itself; Chapter 7 explores this idea and demonstrates the significant computational speedups that result when sensor principles are applied to computation as well. Overall, this thesis introduces hardware implementations and algorithms that draw on deep learning and the advantages of event-based sensors to add intelligence to platforms, achieving a new generation of lower-power, faster-responding, and more accurate systems.
en_US
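
As context for the abstract: the Phased LSTM named above has a published formulation (Neil, Pfeiffer, and Liu, NIPS 2016) in which a rhythmic time gate k(t), parameterized per neuron by a period tau, a phase shift s, and an open ratio r_on, allows the cell state to update only during a brief open phase of each cycle; this is what lets the model consume events at arbitrary, asynchronous timestamps. A minimal Python sketch of that gate follows, assuming the piecewise-linear form from the paper; the leak rate alpha default and the example timestamps are illustrative assumptions, not values taken from the thesis.

    import numpy as np

    def phased_lstm_time_gate(t, tau, s, r_on, alpha=1e-3):
        """Openness k(t) of the Phased LSTM time gate (Neil et al., 2016).

        The gate is open for a fraction r_on of each period tau: it ramps
        up during the first half of the open phase, ramps down during the
        second half, and leaks with small slope alpha while closed so that
        gradients can still propagate during training.
        """
        phi = np.mod(t - s, tau) / tau  # phase within the cycle, in [0, 1)
        return np.where(
            phi < 0.5 * r_on,              # first half of open phase: rise
            2.0 * phi / r_on,
            np.where(
                phi < r_on,                # second half of open phase: fall
                2.0 - 2.0 * phi / r_on,
                alpha * phi,               # closed phase: small leak
            ),
        )

    # Events from an event-based sensor arrive at irregular timestamps;
    # the gate weights each update of the cell state c:
    #   c_t = k(t) * c_proposed + (1 - k(t)) * c_previous
    event_times = np.array([0.00, 0.02, 0.35, 0.82, 1.03, 2.51])
    print(phased_lstm_time_gate(event_times, tau=1.0, s=0.0, r_on=0.1))
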
dc.format
application/pdf
dc.language.iso
en
en_US
dc.publisher
ETH Zurich
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
Deep Neural Networks
en_US
dc.subject
Event-driven sensors
en_US
dc.subject
Deep neural networks (DNNs)
en_US
dc.subject
Spiking deep neural networks
en_US
dc.subject
Recurrent Neural Networks
en_US
dc.subject
Convolutional neural networks
en_US
dc.title
Deep Neural Networks and Hardware Systems for Event-driven Data
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
dc.date.published
2017-07-06
ethz.size
154 p.
en_US
ethz.code.ddc
DDC - DDC::0 - Computer science, information & general works::004 - Data processing, computer science
ethz.identifier.diss
24392
en_US
ethz.publication.place
Zurich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Departement Informationstechnologie und Elektrotechnik / Department of Information Technology and Electrical Engineering::02533 - Institut für Neuroinformatik (INI) / Institute of Neuroinformatics (INI)::03454 - Martin, Kevan A.C. (emeritus)
en_US
ethz.leitzahl.certified
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Departement Informationstechnologie und Elektrotechnik / Department of Information Technology and Electrical Engineering::02533 - Institut für Neuroinformatik (INI) / Institute of Neuroinformatics (INI)::03454 - Martin, Kevan A.C. (emeritus)
en_US
ethz.date.deposited
2017-07-06T11:42:22Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2017-07-06T12:28:24Z
ethz.rosetta.lastUpdated
2022-03-28T17:09:01Z
ethz.rosetta.versionExported
true