Description
Accurate particle shower simulation remains a critical computational bottleneck in high-energy physics. Traditional Monte Carlo simulation with toolkits such as Geant4 is computationally prohibitive at scale, while existing machine learning surrogates are typically tied to a specific detector geometry and require full retraining for every design change or alternative detector configuration.
We present a transfer learning framework for generative calorimeter simulation that enables data-efficient adaptation across diverse detector geometries. Using point cloud representations and pre-training on showers from the International Large Detector (ILD), our approach handles new configurations without re-voxelizing showers for each geometry.
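To make the point cloud idea concrete, the following is a minimal sketch (the to_point_cloud helper and the (x, y, z, energy) layout are illustrative assumptions, not the interface used in the study): each shower is stored as a variable-length set of energy deposits rather than a fixed voxel grid, so no geometry-specific voxelization step is needed.

    import numpy as np

    def to_point_cloud(hits):
        # Hypothetical helper: pack a list of (x, y, z, energy) hits into an
        # (N, 4) float32 array. No voxel grid is involved, so the same layout
        # applies to any detector geometry and any number of hits.
        cloud = np.asarray(hits, dtype=np.float32).reshape(-1, 4)
        # Normalise deposit energies to the total shower energy so showers of
        # different scales share a common range.
        cloud[:, 3] /= cloud[:, 3].sum()
        return cloud

    # Example: three energy deposits from an arbitrary detector layout.
    shower = to_point_cloud([(1.2, 0.4, 10.0, 0.8),
                             (1.5, 0.1, 11.0, 0.5),
                             (0.9, 0.6, 12.0, 0.2)])
    print(shower.shape)  # (3, 4)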
On the CaloChallenge dataset, transfer learning with only 100 target-domain samples achieves a 44% improvement in Wasserstein distance over training from scratch. Parameter-efficient fine-tuning using bias-only adaptation attains competitive performance while updating only 17% of model parameters.
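Bias-only adaptation can be pictured with a short PyTorch-style sketch (an assumed implementation in the spirit of bias-only fine-tuning, not the study's actual code): all pre-trained weights are frozen and only bias terms remain trainable before fine-tuning on the target geometry.

    import torch
    from torch import nn

    def enable_bias_only(model: nn.Module) -> float:
        # Freeze every parameter except bias terms and return the fraction of
        # parameters that remain trainable.
        trainable = total = 0
        for name, param in model.named_parameters():
            param.requires_grad = name.endswith("bias")
            total += param.numel()
            if param.requires_grad:
                trainable += param.numel()
        return trainable / total

    # Hypothetical usage on a stand-in network; the real generator and the
    # reported 17% trainable fraction belong to the study, not to this toy model.
    model = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 4))
    fraction = enable_bias_only(model)
    optimizer = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)
    print(f"trainable fraction: {fraction:.2%}")

Only the freezing step is shown; the optimizer and learning rate here are illustrative rather than the settings used in the study.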
This study provides insight into adaptation mechanisms in particle shower development and establishes a baseline for future progress in point cloud-based calorimeter simulation.