Speaker
Dr Kilian Schwarz (GSI)
Description
The LHC scientific program has led to numerous important physics results. This would not have been possible without the efficient processing of petabytes of data using the Worldwide LHC Computing Grid (WLCG). In the periods following the accelerator and detector upgrades, a large increase in the data rate is expected. In addition, other big experiments such as Belle II and the FAIR collaborations will also take large amounts of data in the coming years. So far, the LHC computing strategy, based on Grid computing that distributes data and CPU resources over a few hundred dedicated sites, has met these challenges. However, to cope with substantially increased data volumes and correspondingly higher CPU requirements, new techniques such as cloud computing and the use of opportunistic resources are necessary. In parallel, the evolving computing models of the affected experiments are addressing a reorganisation of the interplay between computing sites. Recently, the Technical Advisory Board of GridKa, the German WLCG Tier-1 site in Karlsruhe, organised a meeting aimed at identifying guidelines for keeping German HEP and heavy-ion computing at an excellent level for future requirements. In a follow-up meeting, working groups were launched to organise the work on these topics effectively. The presentation will address the challenges, the German strategy, and the current status of the work packages.
Topic (ARD or DTS)
DTS
Primary authors
Dr Kilian Schwarz (GSI)
Dr Thomas Kress (RWTH Aachen University)