Abstract
Parameter tuning is a notoriously time-consuming task in accelerator facilities. As a tool for global optimization with noisy evaluations, Bayesian optimization has recently been shown to outperform alternative methods. By learning a model of the underlying function from all available data, the next evaluation can be chosen carefully to find the optimum in as few steps as possible and without violating any safety constraints. However, the per-step computation time increases significantly with the number of parameters, and the generality of the approach can lead to slow convergence on functions that are easier to optimize. To overcome these limitations, we divide the global problem into sequential subproblems that can be solved efficiently using safe Bayesian optimization. This allows us to trade off local and global convergence and to adapt to additional structure in the objective function. Further, we provide slice plots of the function as user feedback during the optimization. We showcase how we use our algorithm to tune up the FEL output of SwissFEL with up to 40 parameters simultaneously, reaching convergence within reasonable tuning times on the order of 30 minutes (< 2000 steps).
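The decomposition described above can be illustrated with a minimal sketch: cycle over the parameters and solve one low-dimensional subproblem at a time while the remaining parameters stay fixed. All names below (`noisy_objective`, `sequential_optimize`, the toy quadratic objective) are illustrative assumptions, and a plain grid search stands in for the safe Bayesian optimization the paper actually uses for each subproblem.

```python
import random

def noisy_objective(x, rng):
    # Toy stand-in for the measured FEL signal (hypothetical; the real
    # objective at SwissFEL is the measured pulse energy).
    return -sum((xi - 0.3) ** 2 for xi in x) + 0.01 * rng.gauss(0.0, 1.0)

def optimize_subproblem(f, x, dim, lo, hi, n_eval, rng):
    """Solve one 1-D subproblem: vary a single parameter over a grid of
    candidate values while all other parameters stay fixed. (The paper
    solves this step with safe Bayesian optimization; a grid search
    stands in for it here.)"""
    step = (hi - lo) / (n_eval - 1)
    best_val, best_c = float("-inf"), x[dim]
    for i in range(n_eval):
        c = lo + i * step
        trial = list(x)
        trial[dim] = c
        val = f(trial, rng)  # one noisy machine evaluation
        if val > best_val:
            best_val, best_c = val, c
    x[dim] = best_c  # commit the best safe setting for this parameter
    return best_val

def sequential_optimize(f, x0, lo, hi, n_sweeps=3, n_eval=11, seed=0):
    """Cycle over the parameters, solving one subproblem at a time.
    This trades global coverage for fast local progress, which is the
    trade-off the abstract refers to."""
    rng = random.Random(seed)
    x = list(x0)
    best = float("-inf")
    for _ in range(n_sweeps):
        for dim in range(len(x)):
            best = optimize_subproblem(f, x, dim, lo, hi, n_eval, rng)
    return x, best

x_opt, val = sequential_optimize(noisy_objective, [0.0] * 5, -1.0, 1.0)
```

Because each subproblem is one-dimensional, the per-step model cost stays small even when the full problem has 40 parameters; only the number of sweeps grows.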
Permanent link
https://doi.org/10.3929/ethz-b-000385955
Publication status
published
Book title
FEL2019, Proceedings of the 39th International Free-Electron Laser Conference
Publisher
JACoW Publishing
Organisational unit
03908 - Krause, Andreas / Krause, Andreas
Funding
159557 - Explore-exploit with Gaussian Processes under Complex Constraints (SNF)
167212 - Scaling Up by Scaling Down: Big ML via Small Coresets (SNF)
815943 - Reliable Data-Driven Decision Making in Cyber-Physical Systems (EC)
Notes
Conference lecture held on August 29, 2019
ETH Bibliography
yes