Open access
Author
Date
2024
Type
- Doctoral Thesis
ETH Bibliography
yes
Altmetrics
Abstract
Stochastic approximation methods are a class of iterative algorithms that play an essential role in applications involving noisy and incomplete observations.
Rooted in the seminal works of Robbins and Monro (1951) and Kiefer and Wolfowitz (1952), these iterative processes drive a system towards a specified objective despite noise and bias.
Stochastic approximation methods have become increasingly important in fields like statistics and machine learning due to their resilience to noise and low computational costs. They are especially useful in training neural networks and adaptive learning systems. The rise of big data has further heightened the significance of stochastic approximation, since large-scale, real-time decision-making demands efficient and scalable algorithms.
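As a minimal illustrative sketch (not taken from the thesis; the target function, step-size schedule, and noise model are assumptions chosen for the example), the classical Robbins–Monro scheme finds a root of a function observed only through noisy evaluations:

```python
import random

def robbins_monro(noisy_f, x0, steps=5000):
    """Robbins-Monro iteration: x_{n+1} = x_n - a_n * noisy_f(x_n),
    with step sizes a_n = 1/(n+1), so that sum a_n = inf and sum a_n^2 < inf."""
    x = x0
    for n in range(steps):
        a_n = 1.0 / (n + 1)
        x -= a_n * noisy_f(x)
    return x

# Example: find the root of f(x) = x - 2 from observations corrupted
# by additive Gaussian noise.
random.seed(0)
est = robbins_monro(lambda x: (x - 2.0) + random.gauss(0.0, 1.0), x0=0.0)
```

Despite every single evaluation being noisy, the vanishing step sizes average the noise out and the iterates converge to the root at 2.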
This thesis embarks on a renewed exploration of stochastic approximation through a contemporary lens, focusing on its dynamics and long-term behavior in non-Euclidean spaces.
Inspired by the dynamical systems approach introduced by Benaïm and Hirsch in the 1990s, this work tackles crucial questions surrounding the convergence of the iterates of a stochastic approximation algorithm, the characteristics of its limits, and the desirability of these limits.
The thesis aims to provide profound theoretical insights into the stability and convergence of these algorithms amidst the challenges posed by their non-Euclidean structure and real-world conditions marked by noise and incomplete information.
This work examines stochastic approximation within three unconventional contexts where traditional methods are inadequate.
Firstly, we delve into stochastic approximation on Riemannian manifolds, analyzing problems related to Riemannian optimization and strategic games.
This non-Euclidean setting introduces complexities requiring novel approaches for proving convergence.
Secondly, the thesis investigates the discretization of stochastic differential equations by lifting the problem to the Wasserstein space.
This is particularly relevant to sampling algorithms based on the Langevin diffusion, which are essential in Bayesian learning and generative modeling.
We demonstrate how our findings can enhance the reliability and effectiveness of these sampling algorithms, ultimately supporting more robust Bayesian inference and generative processes.
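A hedged sketch of the kind of discretization referred to above (the target density, step size, and iteration counts here are illustrative assumptions, not the thesis's algorithms): the unadjusted Langevin algorithm is the Euler–Maruyama discretization of the Langevin diffusion dX_t = -∇U(X_t) dt + √2 dW_t, whose stationary law is proportional to exp(-U):

```python
import math
import random

def ula(grad_U, x0, step=0.01, n_iters=20000):
    """Unadjusted Langevin algorithm: Euler-Maruyama discretization of
    dX_t = -grad U(X_t) dt + sqrt(2) dW_t, sampling exp(-U) up to a
    discretization bias of order O(step)."""
    x = x0
    samples = []
    for _ in range(n_iters):
        x = x - step * grad_U(x) + math.sqrt(2.0 * step) * random.gauss(0.0, 1.0)
        samples.append(x)
    return samples

# Example: sample from N(0, 1), i.e. U(x) = x^2 / 2 and grad U(x) = x.
random.seed(1)
draws = ula(lambda x: x, x0=3.0)
burned = draws[5000:]          # discard burn-in
mean = sum(burned) / len(burned)
```

After discarding a burn-in period, the empirical mean of the chain is close to the target mean 0, up to Monte Carlo and discretization error.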
Lastly, the thesis explores algorithms lacking a step-size parameter, focusing on the Sinkhorn algorithm for solving the entropic optimal transport and Schrödinger bridge problems.
By proposing a variant with a step-size parameter and examining its continuous-time limit, we offer a stochastic approximation analysis that ensures convergence of this variant of the Sinkhorn algorithm even under noise and bias.
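For context, the standard (step-size-free) Sinkhorn iteration that the thesis's variant departs from can be sketched as follows; the marginals, cost matrix, and regularization strength below are illustrative assumptions:

```python
import math

def sinkhorn(mu, nu, cost, eps=0.1, n_iters=500):
    """Sinkhorn iterations for entropic optimal transport: alternately
    rescale the rows and columns of the Gibbs kernel K = exp(-cost/eps)
    until the plan's marginals match mu and nu."""
    n, m = len(mu), len(nu)
    K = [[math.exp(-cost[i][j] / eps) for j in range(m)] for i in range(n)]
    u = [1.0] * n
    v = [1.0] * m
    for _ in range(n_iters):
        u = [mu[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [nu[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    # Entropic transport plan, with marginals approximately mu and nu.
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

# Example: two-point marginals with a cost that penalizes swapping mass.
mu = [0.5, 0.5]
nu = [0.5, 0.5]
cost = [[0.0, 1.0], [1.0, 0.0]]
plan = sinkhorn(mu, nu, cost)
```

Because each full update is a fixed-point map with no step-size parameter, standard stochastic approximation arguments do not apply directly; introducing a step size, as the thesis does, restores that structure.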
This thesis represents a harmonious blend of classical stochastic approximation and its extended applications in non-Euclidean setups.
By integrating different areas of mathematics, we offer a thorough analysis that enriches the theoretical foundation of stochastic approximation and guarantees the robustness of these algorithms across a diverse array of scenarios.
Persistent link
https://doi.org/10.3929/ethz-b-000696217
Publication status
published
External links
Search for a print copy via the ETH Library
Contributors
Examiner: Krause, R. Andreas
Examiner: Bartlett, Peter L.
Examiner: Hassani, Hamed S.
Examiner: Mertikopoulos, Panayotis
Publisher
ETH Zurich
Subject
Stochastic Approximation; Dynamical Systems; Riemannian Geometry; Wasserstein Spaces; Entropic Optimal Transport; Sinkhorn Algorithm; Schrödinger Bridges
Organisational unit
03908 - Krause, Andreas / Krause, Andreas