A special focus in this topic should be on the images produced by the fast 2D detectors, alongside the auxiliary data needed from e.g. beam monitoring or instrument monitoring, and how this is organized and configured. For the 2D detectors, data is by default defined as a set of calibrated images, i.e. in units of photons per pixel per pulse. Data will be stored in C (row-major) ordering. Zero-suppression according to noise thresholds is foreseen. Additionally, data may already be converted to event lists by the calibration and correction routines, depending on the observation type: e.g. this is feasible if speckle patterns are foreseen to be observed, but less so for ring-type diffraction patterns.
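As an illustration of the foreseen per-image processing, the sketch below applies noise-threshold zero-suppression to a calibrated image and converts a sparse (e.g. speckle) image into an event list. The array shape, threshold value and the (row, col, photons) event layout are assumptions for illustration only, not the actual output of the calibration pipeline.

```python
import numpy as np

def zero_suppress(image, noise_threshold):
    """Set pixels below the noise threshold to zero (units: photons/pixel/pulse)."""
    suppressed = image.copy()
    suppressed[suppressed < noise_threshold] = 0.0
    return suppressed

def to_event_list(image, noise_threshold):
    """Convert a sparse image into an event list of (row, col, photons) triples."""
    rows, cols = np.nonzero(image >= noise_threshold)
    return np.column_stack((rows, cols, image[rows, cols]))

# Example: one calibrated image; NumPy arrays are C-ordered by default.
image = np.random.poisson(0.05, size=(512, 128)).astype(np.float32)
events = to_event_list(image, noise_threshold=0.5)
print(events.shape)  # (n_events, 3)
```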
The XFEL data formats use HDF5 for storage. Archival and streaming formats are foreseen, both of which will be able to handle data expressible as Karabo hashes; this includes images and matrices of arbitrary rank. From within Karabo, an I/O interface will be provided. Is there a need to foresee additional API interfaces for legacy software? How can it be guaranteed that data acquired through such interfaces follows the XFEL data format standards, e.g. through documentation or through the API?
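A minimal sketch of writing calibrated detector images to HDF5 with h5py is shown below. The group and dataset names, chunking and attributes are illustrative assumptions and do not reproduce the XFEL archival format specification.

```python
import h5py
import numpy as np

# Hypothetical layout for illustration only: one stack of calibrated images
# plus the train IDs they belong to.
images = np.random.poisson(0.05, size=(10, 512, 128)).astype(np.float32)  # (pulse, y, x)
train_ids = np.arange(10, dtype=np.uint64)

with h5py.File("example_run.h5", "w") as f:
    grp = f.create_group("INSTRUMENT/EXAMPLE_DET/image")
    grp.create_dataset("data", data=images, chunks=(1, 512, 128), compression="gzip")
    grp.create_dataset("trainId", data=train_ids)
    grp["data"].attrs["units"] = "photons/pixel/pulse"
```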
Concurrency: Karabo will handle message and data passing between individual device servers. Scalable topologies and scatter/gather techniques are supported. What additional concurrency frameworks should be foreseen as usable at the device level, e.g. ipyparallel or ØMQ? How could existing software best be integrated into a concurrent Karabo environment, e.g. using the aforementioned additional frameworks? The goal of this topic would be to provide a list of concurrency scenarios that may occur, and ideally input for a best-practices document on how to implement them.
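One such scenario, a scatter/gather pipeline over ØMQ, is sketched below using pyzmq: a ventilator scatters per-pulse images to worker threads over PUSH/PULL sockets, and the workers push results back to a sink. The ports, message layout and the per-image processing step are assumptions for illustration; this does not represent Karabo's internal pipeline protocol.

```python
import threading
import zmq
import numpy as np

def worker(ctx):
    receiver = ctx.socket(zmq.PULL)
    receiver.connect("tcp://127.0.0.1:5557")   # receive scattered work items
    sender = ctx.socket(zmq.PUSH)
    sender.connect("tcp://127.0.0.1:5558")     # push results to the sink
    while True:
        image = receiver.recv_pyobj()
        if image is None:                      # sentinel: shut down
            break
        sender.send_pyobj(float(image.sum()))  # e.g. integrated intensity per pulse

ctx = zmq.Context()
ventilator = ctx.socket(zmq.PUSH)
ventilator.bind("tcp://127.0.0.1:5557")
sink = ctx.socket(zmq.PULL)
sink.bind("tcp://127.0.0.1:5558")

threads = [threading.Thread(target=worker, args=(ctx,)) for _ in range(4)]
for t in threads:
    t.start()

n_pulses = 32
for _ in range(n_pulses):
    ventilator.send_pyobj(np.random.poisson(0.05, size=(512, 128)))
results = [sink.recv_pyobj() for _ in range(n_pulses)]

for _ in threads:                              # one sentinel per worker (round-robin)
    ventilator.send_pyobj(None)
for t in threads:
    t.join()
print(len(results))
```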
Discussion on identifying common, cross-instrument requirements and optimizations thereof