Accurate simulation of the interaction of particles with detector materials is of
utmost importance for the success of modern particle physics. Software libraries such as
GEANT4 already allow physical processes inside detectors to be modelled
with high precision. The downside of this approach is its high cost in
computing time.
Recent developments in generative...
Detector simulation is a cornerstone of modern high energy physics. Traditional simulation tools rely on Monte Carlo methods, which consume significant computational resources and are projected to become a major bottleneck at the high-luminosity stage of the LHC and at future colliders. Calorimeter shower simulation has been a focus of fast-simulation efforts, as it is particularly...
We present NPointFunctions, a tool developed to obtain any desired one-loop amplitude for an arbitrary BSM model.
It aims to be customizable, modular, and extensible with additional process- or amplitude-dependent contributions.
It relies on SARAH-generated output used with the FeynArts/FormCalc packages, interfaced in an appropriate way.
Currently, several LFV processes...
Highly precise simulations of elementary-particle interactions and processes are fundamental to accurately reproducing and interpreting the experimental results in High Energy Physics (HEP) detectors and to correctly reconstructing the particle flows. Today, detector simulations typically rely on Monte Carlo based methods, which are extremely demanding in terms of computing resources. The need for...
The pixel vertex detector (PXD) is the newest and most sensitive subdetector of the Belle II experiment. Data from the PXD and other sensors allow us to reconstruct particle tracks and decay vertices. The effect of background processes on track reconstruction is simulated by adding measured or simulated background hit patterns to the hits produced by simulated signal particles, which originate from...
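The overlay technique described above can be illustrated with a minimal sketch; all names and the hit representation are hypothetical, not the Belle II software interface:

```python
import random

def overlay_background(signal_hits, background_library, n_overlay=1):
    """Illustrative background overlay: merge the hits of a simulated
    signal event with randomly sampled (measured or simulated)
    background hit patterns.  Hit format and names are hypothetical."""
    hits = list(signal_hits)
    for _ in range(n_overlay):
        # pick one recorded background pattern and add all of its hits
        hits.extend(random.choice(background_library))
    return hits

# toy example: each hit is (pixel_row, pixel_col, collected_charge)
signal = [(10, 20, 35.0)]
bkg_lib = [[(100, 7, 12.0)], [(55, 60, 21.0)]]
merged = overlay_background(signal, bkg_lib)
```

In practice the background patterns come from dedicated random-trigger data or background simulation, so the overlay reproduces realistic occupancy on top of the signal hits.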
The imaging capabilities of highly granular calorimeters allow the inner structure of hadronic showers to be studied in detail. Reconstructing the particle composition and the properties of secondary showers in each hadronic cascade provides additional information that can be used in different applications. This contribution presents a graph-neural-network-based reconstruction of electromagnetic...
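A graph neural network needs the calorimeter hits expressed as a graph first; a common (though here purely illustrative) choice is to connect each hit to its k nearest neighbours in space. This sketch shows only that generic graph-building step, not the specific network of the contribution:

```python
import math

def build_knn_graph(hits, k=2):
    """Toy sketch: connect each calorimeter hit to its k nearest
    neighbours in 3D, producing directed edges (i, j)."""
    edges = []
    for i, h in enumerate(hits):
        # sort all other hits by Euclidean distance and keep the k closest
        neighbours = sorted(
            (j for j in range(len(hits)) if j != i),
            key=lambda j: math.dist(h, hits[j]),
        )[:k]
        edges.extend((i, j) for j in neighbours)
    return edges

# four toy hits: three clustered near the origin, one far away
hits = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (5, 5, 5)]
edges = build_knn_graph(hits, k=2)
```

The resulting edge list, together with per-hit features such as deposited energy, is what a message-passing network would consume.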
In this talk I present a method to reconstruct the kinematics of neutral-current deep inelastic scattering (DIS) using a deep neural network (DNN). Unlike traditional methods, it exploits the full kinematic information of both the scattered electron and the hadronic final state, and it accounts for QED radiation by identifying events with radiated photons and event-level momentum imbalance....
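For context, one of the traditional baselines such a DNN is compared against is the classic "electron method", which uses only the scattered electron. A minimal sketch, assuming the HERA-style convention (polar angle measured with respect to the proton beam, particle masses neglected):

```python
import math

def electron_method(E_e, E_p, E_scat, theta):
    """Reconstruct NC DIS kinematics from the scattered electron alone.
    E_e, E_p: electron and proton beam energies; E_scat, theta: energy
    and polar angle of the scattered electron (HERA convention)."""
    s = 4.0 * E_e * E_p                                         # squared c.m. energy
    Q2 = 2.0 * E_e * E_scat * (1.0 + math.cos(theta))           # photon virtuality
    y = 1.0 - (E_scat / (2.0 * E_e)) * (1.0 - math.cos(theta))  # inelasticity
    x = Q2 / (s * y)                                            # Bjorken x
    return x, y, Q2

# illustrative HERA-like beam energies and a plausible scattered electron
x, y, Q2 = electron_method(E_e=27.6, E_p=920.0, E_scat=25.0, theta=2.5)
```

The electron method degrades when QED radiation distorts the measured electron kinematics, which is precisely the regime the DNN approach is designed to handle by using the hadronic final state as well.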
Despite continuous searches for physics beyond the standard model by the LHC physics program as well as other experiments, no evidence has been found so far. A major disadvantage of many current searches is their reliance on specific signal and background models. Since it is impossible to cover all possible models and phase-space regions with dedicated searches, the...
The associated production of a bb̄ pair with a Higgs boson could provide an important probe of both the size and the phase of the bottom-quark Yukawa coupling, yb. However, the signal is shrouded by several background processes, including the irreducible Zh, Z→bb̄ background. We show that the analysis of kinematic shapes provides a concrete prescription for separating the yb-sensitive...
Dynamically and opportunistically extending the compute resources of an HTC cluster (ATLAS-BFG)
with those of an HPC cluster (NEMO) allows the computational capacity to be increased
according to user demand and resource availability, and thus leads to an efficient use of
resources across the boundaries of clusters and disciplines. This is completely transparent,...
A file caching setup to access ATLAS data was deployed for the NEMO HPC cluster in Freiburg, using
an XRootD proxy server running in forwarding mode. This setup allows HEP workflows to run without
the need to operate large local data storage. Several performance tests were carried out to measure any
potential overhead caused by the caching setup with respect to direct, non-cached data access.
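The forwarding-proxy setup can be sketched with a minimal XRootD configuration fragment. This is an illustrative assumption based on the standard XRootD proxy service directives, not the actual NEMO configuration; paths are placeholders:

```
# illustrative XRootD proxy configuration (placeholder values)
all.export /
ofs.osslib   libXrdPss.so     # load the proxy storage system (Pss)
pss.origin   =                # '=' enables forwarding mode: clients
                              # name the data origin in the request URL
pss.cachelib libXrdPfc.so     # attach the file cache plugin
oss.localroot /var/cache/xrd  # local disk area holding cached files
```

With such a setup, jobs read via the proxy URL and the cache transparently serves repeated accesses from local disk instead of the remote origin.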
The lepton–proton collisions produced at the HERA collider represent a unique high energy physics data set. A number of years after the end of collisions, the data collected by the H1 experiment, as well as the simulated events and all software needed for reconstruction, simulation and data analysis were migrated into a preserved operational mode at DESY. A recent modernisation of the H1...
After the long shutdown preparing the CMS detector for Run 3, the tracker alignment constants, namely the position, orientation, and curvature of each of the 15148 modules that compose the tracking system, need to be derived again with high precision in order to ensure good detector performance for physics analysis. This process constitutes a major computational challenge due to...
Extensions of the two-Higgs-doublet model with a singlet scalar can easily accommodate all current experimental constraints and are highly motivated candidates for physics beyond the Standard Model. They can provide a dark matter candidate, explain baryogenesis, and produce gravitational wave signals. In this work, we focus on the dark matter phenomenology of the two-Higgs-doublet model...
It is well known that e+e− colliders can exclude or discover, with certainty, any SUSY model that predicts a next-to-lightest SUSY particle (NLSP) with a mass up to slightly below half the centre-of-mass energy of the collider. Here, we present an estimate of the power of present and future hadron colliders to extend the reach of SUSY searches, with particular...