Metadata only
Date
2014-12
Type
Report
ETH Bibliography
yes
Abstract
We analyze reduced basis acceleration of recently proposed deterministic Bayesian inversion algorithms for partial differential equations with uncertain distributed parameters and observation data subject to additive Gaussian observation noise. Specifically, we consider Bayesian inversion of affine-parametric, linear operator families on possibly high-dimensional parameter spaces, with “high-fidelity” Petrov-Galerkin (PG) discretizations of these countably-parametric operator families: we allow general families of inf-sup stable PG Finite-Element methods, covering most conforming primal and mixed Finite-Element discretizations of standard problems in mechanics. We propose reduced basis acceleration of the high-dimensional, parametric forward response maps, which must be solved numerically many times in Bayesian inversion, and derive convergence rate bounds for the error in the Bayesian estimate incurred by the use of reduced bases. As a consequence of recent theoretical results on dimension-independent sparsity of parametric responses, and on preservation of sparsity for holomorphic-parametric problems, we establish new convergence rates of greedy reduced basis approximations both for the parametric forward maps and for the countably-parametric posterior densities which arise in Bayesian inversion. We show that the convergence rates of the reduced basis approximations of the parametric forward maps and of the countably-parametric, deterministic Bayesian posterior densities are free from the curse of dimensionality and depend only on the sparsity of the uncertain input data. In particular, we establish quadratic convergence of the reduced basis approximation of the posterior densities with respect to that of the parametric forward maps. Numerical experiments for model elliptic, affine-parametric problems in two space dimensions with hundreds of parameters confirm that the proposed adaptive, deterministic reduced basis algorithms indeed exploit the sparsity of both the parametric forward maps and the Bayesian posterior density.
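For orientation, the setting summarized above can be sketched as follows; the notation ($\bar A$, $A_j$, $f$, $\mathcal O$, $\Gamma$, $q_N$, $\Theta$) is generic and assumed here, not taken from the report. The uncertain input enters through an affine-parametric operator family on a parameter domain $U=[-1,1]^{\mathbb N}$,
\[
A(y) \;=\; \bar A + \sum_{j\ge 1} y_j\,A_j, \qquad y=(y_j)_{j\ge 1}\in U,
\]
with forward map $y\mapsto q(y)$ defined by $A(y)\,q(y)=f$. Noisy observations $\delta = \mathcal O(q(y)) + \eta$ with $\eta\sim N(0,\Gamma)$ lead, in the deterministic Bayesian framework, to a posterior density with respect to the prior of the form
\[
\Theta(y) \;\propto\; \exp\!\Big(-\tfrac12\,\big\|\delta-\mathcal O(q(y))\big\|_{\Gamma^{-1}}^{2}\Big).
\]
Read schematically, the quadratic convergence statement in the abstract says that if $q_N$ denotes the reduced basis surrogate of the forward map and $\Theta_N$ the induced surrogate of the posterior density, then the density error scales like the square of the forward-map error,
\[
\sup_{y\in U}\big|\Theta(y)-\Theta_N(y)\big| \;\lesssim\; \sup_{y\in U}\big\|q(y)-q_N(y)\big\|^{2};
\]
the precise statement and constants are those of the report, not of this sketch.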
Publication status
published
Journal / series
Research Report
Volume
Publisher
ETH-Zürich
Subject
Parametric Operator Equations; Bayesian Inversion; Reduced Basis; Sparse Grid; A Posteriori Error Estimate; A Priori Error Estimate; Best N-term Convergence; Curse of Dimensionality
Organisational unit
03435 - Schwab, Christoph / Schwab, Christoph