Open access
Author
Date
2020-01-31
Type
- Master Thesis
ETH Bibliography
yes
Abstract
Deep neural networks and the ENO procedure are both efficient frameworks for approximating rough functions. We prove that, at any order, the stencil shifts of the ENO and ENO-SR interpolation procedures can be exactly obtained using a deep ReLU neural network. In addition, we construct, and provide error bounds for, ReLU neural networks that directly approximate the output of the ENO and ENO-SR interpolation procedures. This surprising fact enables the transfer of several desirable properties of the ENO procedure to deep neural networks, including its high-order accuracy at approximating Lipschitz functions. Numerical tests for the resulting neural networks show excellent performance for interpolating rough functions, data compression, and approximating solutions of nonlinear conservation laws.
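The "stencil shift" the abstract refers to is the output of the classical ENO stencil-selection step: starting from a two-point stencil, the stencil is repeatedly extended to the left or to the right, whichever side gives the smaller (undivided) Newton difference, so that the final interpolation stencil avoids discontinuities. A minimal sketch of this selection on a uniform grid (an illustrative implementation of the standard algorithm, not code from the thesis; the function name is ours):

```python
import numpy as np

def eno_stencil_shift(f, i, p):
    """ENO stencil shift r for interpolating on the interval [x_i, x_{i+1}]
    with p+1 points of the sample array f (uniform grid assumed).

    The chosen stencil is {x_{i-r}, ..., x_{i-r+p}}; r = 0 means no shift.
    """
    left = i  # leftmost index of the current stencil {i, i+1}
    for k in range(2, p + 1):
        # k-th undivided differences of the two candidate extensions:
        d_left = np.diff(f[left - 1 : left + k], n=k)[0]    # extend leftward
        d_right = np.diff(f[left : left + k + 1], n=k)[0]   # extend rightward
        if abs(d_left) < abs(d_right):
            left -= 1  # the left extension is smoother; shift the stencil
    return i - left

# Samples with a jump between indices 3 and 4: the stencil for the
# interval [x_2, x_3] shifts fully to the left, away from the jump.
f = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0])
print(eno_stencil_shift(f, 2, 3))
```

The thesis result is that this data-dependent, piecewise-constant map from samples to shifts can be represented exactly by a deep ReLU network, since each comparison of differences is a piecewise-linear decision.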
Permanent link
https://doi.org/10.3929/ethz-b-000397533
Publication status
published
Contributors
Examiner: Mishra, Siddhartha
Publisher
ETH Zurich
Subject
Numerical analysis; Interpolation; Deep learning; ENO reconstruction; ReLU
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha
02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics