Open access
Author(s)
Date
2019-11
Type
Conference Paper
ETH Bibliography
yes
Abstract
Parameter tuning is a notoriously time-consuming task in accelerator facilities. As a tool for global optimization with noisy evaluations, Bayesian optimization was recently shown to outperform alternative methods. By learning a model of the underlying function from all available data, the next evaluation can be chosen carefully to find the optimum in as few steps as possible and without violating any safety constraints. However, the per-step computation time increases significantly with the number of parameters, and the generality of the approach can lead to slow convergence on functions that are easier to optimize. To overcome these limitations, we divide the global problem into sequential subproblems that can be solved efficiently using safe Bayesian optimization. This allows us to trade off local and global convergence and to adapt to additional structure in the objective function. Further, we provide slice plots of the function as user feedback during the optimization. We showcase how we use our algorithm to tune up the FEL output of SwissFEL with up to 40 parameters simultaneously, reaching convergence within reasonable tuning times on the order of 30 minutes (< 2000 steps).
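The decomposition described in the abstract — splitting a high-dimensional tuning problem into a sequence of low-dimensional subproblems, each solved while respecting a safety constraint — can be illustrated with a deliberately simplified sketch. The paper solves each subproblem with safe Bayesian optimization over a Gaussian-process model; in this toy version the GP surrogate and acquisition function are replaced by plain random candidate search over one coordinate at a time, and the quadratic objective, step size, and safety threshold are purely illustrative, not from the paper.

```python
import random

# Toy stand-in for the tuning objective (maximum at x = 0). In the real
# setting this would be a noisy machine measurement such as FEL pulse energy.
def objective(x):
    return -sum(xi ** 2 for xi in x)

def optimize_subproblem(x, dim, candidates, safety_threshold):
    """Solve one 1-D subproblem: vary a single coordinate, keeping only
    evaluations that stay above the safety threshold."""
    best_x, best_y = list(x), objective(x)
    for c in candidates:
        trial = list(x)
        trial[dim] = c
        y = objective(trial)
        if y >= safety_threshold and y > best_y:  # reject unsafe points
            best_x, best_y = trial, y
    return best_x, best_y

def sequential_tuning(x0, n_sweeps, safety_threshold, step=0.5, n_cand=20):
    """Decompose the d-dimensional problem into sequential 1-D subproblems
    and sweep over them repeatedly, trading global for local search."""
    x = list(x0)
    for _ in range(n_sweeps):
        for dim in range(len(x)):
            candidates = [x[dim] + random.uniform(-step, step)
                          for _ in range(n_cand)]
            x, _ = optimize_subproblem(x, dim, candidates, safety_threshold)
    return x, objective(x)
```

Starting from a safe initial setting, each sweep improves one parameter at a time, so the per-step cost stays flat as the total number of parameters grows, which is the motivation for the decomposition in the high-dimensional (up to 40 parameters) SwissFEL setting.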
Persistent link
https://doi.org/10.3929/ethz-b-000385955
Publication status
published
External links
Editor(s)
Book title
FEL2019, Proceedings of the 39th International Free-Electron Laser Conference
Pages / article number
Publisher
JACoW Publishing
Conference
Organisational unit
03908 - Krause, Andreas / Krause, Andreas
Funding
159557 - Explore-exploit with Gaussian Processes under Complex Constraints (SNF)
167212 - Scaling Up by Scaling Down: Big ML via Small Coresets (SNF)
815943 - Reliable Data-Driven Decision Making in Cyber-Physical Systems (EC)
Notes
Conference lecture held on August 29, 2019