Speaker
Dr. Oliver Schulz (MPI Munich)
Description
The GERDA experiment is designed to search for neutrinoless double beta decay of Ge-76, using an array of isotopically enriched high-purity germanium detectors.
While GERDA is built using state-of-the-art low-background construction techniques, these alone cannot provide the ultra-low background levels required for and achieved by the experiment. We present the data analysis and machine learning (ML) techniques currently employed by the GERDA collaboration to substantially reduce background levels during offline data processing. We also present promising novel approaches for future improvements that may benefit both GERDA and next-generation Ge-76 double beta decay experiments.
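As a rough illustration of the kind of ML-based background rejection mentioned above (and not the GERDA collaboration's actual analysis chain), the sketch below trains a toy pulse-shape classifier. The synthetic waveform model, the "A/E"-like and rise-time features, and the choice of a gradient-boosted classifier are all assumptions made purely for illustration.

```python
# Illustrative sketch only: a toy pulse-shape discrimination classifier.
# Single-site (signal-like) pulses rise once; multi-site (background-like)
# pulses are modeled as two displaced rises. All parameters are made up.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_ticks = 2000, 256
t = np.arange(n_ticks)

def toy_pulse(single_site: bool) -> np.ndarray:
    """Synthetic charge pulse with electronic noise added."""
    rise = lambda t0: 1.0 / (1.0 + np.exp(-(t - t0) / 5.0))
    if single_site:
        wf = rise(rng.uniform(100, 140))
    else:
        a = rng.uniform(0.3, 0.7)
        wf = a * rise(rng.uniform(90, 120)) + (1 - a) * rise(rng.uniform(130, 160))
    return wf + rng.normal(0, 0.01, n_ticks)

labels = rng.integers(0, 2, n_samples)  # 1 = signal-like, 0 = background-like
waveforms = np.array([toy_pulse(bool(y)) for y in labels])

# Simple pulse-shape features: maximum current (waveform derivative) divided
# by total collected charge, plus the 10%-90% rise time in clock ticks.
current_max = np.max(np.diff(waveforms, axis=1), axis=1)
energy = waveforms[:, -50:].mean(axis=1)
rise_time = np.array([
    np.argmax(wf > 0.9 * wf[-50:].mean()) - np.argmax(wf > 0.1 * wf[-50:].mean())
    for wf in waveforms
])
features = np.column_stack([current_max / energy, rise_time])

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("ROC AUC on toy data:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

In a real offline processing chain, features of this sort (or the raw waveforms themselves) would be extracted from calibration and physics data, and the classifier output used to reject background-like events while retaining signal efficiency.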