The European XFEL is a complex machine built from hundreds of subsystems, many of which require frequent calibration. Automating these calibrations frees operator time and can increase the exploitation of the allotted beamtime.
Three use cases are shown in this presentation. The first uses Bayesian Optimization to spatially align an optical laser to a camera. A...
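The abstract does not spell out the optimisation loop; as a hedged illustration only, a Bayesian-Optimization alignment of this kind could be sketched with a Gaussian-process optimiser such as scikit-optimize, where the motor interface (move_motors), the camera readout (read_spot_centroid), the motor ranges, and the target pixel are all hypothetical placeholders:

# Minimal Bayesian-optimisation alignment sketch: two steering-motor axes,
# a camera giving the spot centroid, and the distance to a target pixel as
# the objective to minimise.  All hardware calls are placeholders.
import numpy as np
from skopt import gp_minimize

TARGET = np.array([512.0, 512.0])   # desired spot position on the camera (pixels, assumed)

def move_motors(x_mrad, y_mrad):
    """Placeholder for the real control-system call that steers the laser."""

def read_spot_centroid():
    """Placeholder: grab a camera frame and return the spot centroid in pixels."""
    return np.array([500.0, 520.0])  # dummy value so the sketch runs

def objective(params):
    """Offset between measured spot and target; smaller is better."""
    move_motors(*params)
    centroid = read_spot_centroid()
    return float(np.linalg.norm(centroid - TARGET))

# Gaussian-process surrogate and acquisition function handled by scikit-optimize.
result = gp_minimize(
    objective,
    dimensions=[(-5.0, 5.0), (-5.0, 5.0)],  # motor ranges in mrad (illustrative)
    n_calls=30,
    random_state=0,
)
print("best motor settings:", result.x, "residual offset:", result.fun)

The appeal of this approach for alignment tasks is sample efficiency: each objective evaluation costs a real motor move and a camera frame, so the surrogate model keeps the number of evaluations small.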
The talk focuses on an ongoing effort to predict x-ray pulse properties from machine settings and available diagnostics via a surrogate model. While still at an early stage, preliminary results already provide useful insights into the correlation between electron bunches and x-ray spectral properties at MHz repetition rates. The goal of the program is not only to provide a...
Virtual diagnostic tools that leverage readily available input data offer a non-invasive way to optimize Free-Electron Laser (FEL) operation and delivery, especially when conventional diagnostics reach their limitations. This work presents a novel approach using an artificial neural network to predict photon pulse pointing online at MHz rates for both soft and hard x-rays. The model input is...
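The abstract leaves the network unspecified beyond "artificial neural network"; a minimal sketch of such a virtual-diagnostic regressor, assuming a small fully connected model, an illustrative number of input diagnostics, and stand-in training data, might look like:

# Hedged sketch of a virtual-diagnostic regressor: a small MLP mapping readily
# available machine diagnostics to the photon-pulse pointing (two coordinates).
# Feature count, network size, and the synthetic data are illustrative only.
import torch
from torch import nn

N_DIAGNOSTICS = 16   # e.g. BPM readings, bunch charge, undulator settings (assumed)

model = nn.Sequential(
    nn.Linear(N_DIAGNOSTICS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),            # predicted pointing (x, y)
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in training data; in practice these would be archived shots where the
# pointing was measured by an invasive or downstream diagnostic.
X = torch.randn(10_000, N_DIAGNOSTICS)
y = torch.randn(10_000, 2)

for epoch in range(20):
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()

# Online use: one cheap forward pass per shot.
with torch.no_grad():
    pointing = model(torch.randn(1, N_DIAGNOSTICS))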
At the LHC, collision events are produced every 25 ns. To handle the resulting data streams, the CMS trigger system filters events in real time. The first stage of that system, the Level-1 trigger, is implemented in hardware using FPGAs. We present a novel ML-based anomaly detection algorithm that has been integrated into the Level-1 Trigger and has successfully taken data during the 2024 pp...
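The abstract does not name the architecture; one common pattern for Level-1 anomaly detection, shown here purely as a hedged sketch, is a compact autoencoder whose reconstruction error serves as the anomaly score (input dimension and layer sizes are assumed):

# Hedged illustration only: a tiny autoencoder over a flat vector of
# trigger-object quantities; events with large reconstruction error are kept.
import torch
from torch import nn

N_INPUTS = 57        # e.g. (pT, eta, phi) of a fixed list of trigger objects (assumed)

class TinyAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(N_INPUTS, 16), nn.ReLU(), nn.Linear(16, 4))
        self.decoder = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, N_INPUTS))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinyAutoencoder()

def anomaly_score(event: torch.Tensor) -> torch.Tensor:
    """Mean squared reconstruction error; events above a threshold are kept."""
    with torch.no_grad():
        return ((model(event) - event) ** 2).mean(dim=-1)

In practice such a network would have to be quantized and compiled into FPGA firmware (tools such as hls4ml are commonly used for this step), with the score threshold tuned to the available trigger bandwidth.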
The likelihood ratio (LR) plays an important role in statistics and many domains of science. The Neyman-Pearson lemma states that it is the most powerful test statistic for simple statistical hypothesis testing problems [1] or binary classification problems. Likelihood ratios are also key to Monte Carlo importance sampling techniques [2]. Unfortunately, in many areas of study the probability...
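For reference, the two standard results invoked here can be stated compactly (textbook formulations, not specific to this talk):

% Neyman-Pearson: the most powerful test of H_0: x ~ p_0 against H_1: x ~ p_1
% at a given size rejects H_0 when the likelihood ratio exceeds a threshold.
\Lambda(x) = \frac{p_1(x)}{p_0(x)}, \qquad \text{reject } H_0 \iff \Lambda(x) > k_\alpha .

% Importance sampling: expectations under p can be estimated from samples drawn
% from q, reweighted by the density (likelihood) ratio.
\mathbb{E}_{x \sim p}\!\left[f(x)\right]
  = \mathbb{E}_{x \sim q}\!\left[f(x)\,\frac{p(x)}{q(x)}\right]
  \approx \frac{1}{N}\sum_{i=1}^{N} f(x_i)\,\frac{p(x_i)}{q(x_i)}, \qquad x_i \sim q .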
A normalising flow is a probabilistic tool that can be used for generative modelling and reconstruction. While not the lightest models in the toolbox, normalising flows are often very accurate, and their bi-directionality can be uniquely advantageous. The literature guiding architecture and design choices for users of these models focuses on non-HEP applications, and optimal results in HEP...
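No architecture is given in the abstract; purely as an illustration of the bi-directionality mentioned above, a minimal RealNVP-style coupling flow (layer sizes, depth, and the stand-in data are all assumed) could be sketched as:

# Minimal coupling-flow sketch: an invertible map between data and a Gaussian
# base distribution, trained by maximising the exact log-likelihood obtained
# from the change-of-variables formula.  Illustrative only.
import torch
from torch import nn

class AffineCoupling(nn.Module):
    """One coupling layer: half the features get an affine transform whose
    scale and shift are predicted from the other half."""
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        assert dim % 2 == 0
        self.flip = flip
        self.net = nn.Sequential(
            nn.Linear(dim // 2, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),      # outputs scale and shift
        )

    def _split(self, x):
        a, b = x.chunk(2, dim=-1)
        return (b, a) if self.flip else (a, b)

    def _merge(self, a, b):
        return torch.cat((b, a) if self.flip else (a, b), dim=-1)

    def forward(self, x):
        a, b = self._split(x)
        s, t = self.net(a).chunk(2, dim=-1)
        s = torch.tanh(s)                # keep the Jacobian well conditioned
        y = self._merge(a, b * torch.exp(s) + t)
        return y, s.sum(dim=-1)          # transformed sample, log|det J|

    def inverse(self, y):
        a, b = self._split(y)
        s, t = self.net(a).chunk(2, dim=-1)
        s = torch.tanh(s)
        return self._merge(a, (b - t) * torch.exp(-s))

class Flow(nn.Module):
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            AffineCoupling(dim, flip=(i % 2 == 1)) for i in range(n_layers)
        )
        self.base = torch.distributions.Normal(0.0, 1.0)

    def log_prob(self, x):
        log_det = torch.zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return self.base.log_prob(x).sum(dim=-1) + log_det

    def sample(self, n):
        z = self.base.sample((n, self.dim))
        for layer in reversed(self.layers):
            z = layer.inverse(z)
        return z

# Training maximises the exact likelihood of (stand-in) data under the flow.
flow = Flow(dim=4)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
data = torch.randn(2048, 4)
for step in range(200):
    opt.zero_grad()
    loss = -flow.log_prob(data).mean()
    loss.backward()
    opt.step()

The same trained object is used in both directions: log_prob evaluates densities of reconstructed quantities, while sample generates new events by inverting the layers, which is the bi-directionality the abstract refers to.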
Searches for physics beyond the Standard Model at the Large Hadron Collider usually rely on phenomena that affect leptons, photons, or jets with high transverse momenta (> 15 GeV).
Alongside these hard physics objects, proton-proton collisions produce a multitude of soft ones, known as the underlying event. This work focuses on searching for anomalies among the soft physics objects,...
Ever-increasing collision rates place significant computational stress on the simulation of future high-energy-physics experiments. Generative machine learning (ML) models have been shown to speed up and augment the most computationally intensive part of the traditional simulation chain: the calorimeter simulation. Many previous studies relied on a fixed, grid-like data representation of...
Monte Carlo (MC) simulations are essential for collider experiments, enabling comparisons between experimental findings and theoretical predictions. However, these simulations are computationally demanding, and future developments, such as increased event rates, are expected to exceed the available computational resources. Generative modeling can substantially cut computing costs by augmenting MC...