Speaker
Description
Large language models (LLMs) have demonstrated formidable capabilities in analyzing and synthesizing natural language text. This technology presents an opportunity to extract and connect knowledge from the expansive corpus of textual and visual artifacts accumulated over decades of research at the Deutsches Elektronen-Synchrotron (DESY).
This talk will detail current efforts within the DESY MCS4 group to distill insights from various DESY knowledge resources and to better interpret the collective expertise generated over the center's long operational history.
Initial investigations focus on deploying LLMs to process materials from particle accelerator conferences in order to model semantic relationships within this discipline. We further examine the application of basic LLM architectures to sequences of log data from control system watchdog nodes, with the aim of detecting anomalies. Finally, we share efforts to develop supervised LLMs that ingest specific sources of DESY documentation and automatically generate knowledge summaries, an approach with considerable potential across many domains at DESY.