Speaker
Description
Over the last decades of development in radio astronomy, data rates and data complexity have increased many times over. This is due to multiple factors, among them the shift of the main focus of radio astronomical research towards time-varying and transient signals. In addition, the technological development of a variety of ground- and space-based telecommunication systems has intensified the radio-interference background. Together, these factors make it impossible to record all signals during observations and process them afterwards: the amount of data exceeds all feasible storage volumes. Unavoidably, some signals, or parts of them, must be rejected by semi- or fully automated procedures and are lost. It seems reasonable to adopt ideas from information theory to mitigate this data loss and to identify which signals should be recorded for observations to succeed. Our main goal is to develop new approaches and techniques that can quantify the "amount" of information in the data stream of a radio telescope and help determine how to record the least amount of data while keeping the most information. At the current stage of our project we are working in two directions. First, we are looking for mathematical connections between the different methods used for signal detection. Second, we are investigating ways to identify global and local features of a signal and to compute the information content of the data stream based on these features. In this contribution, I will present the general picture of signal detection and compression from a statistical point of view, together with approaches to computing information content in this framework.
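As a rough illustration of what "information content" of a data stream can mean in a statistical framework, the following is a minimal sketch (not the method presented in the talk) of a histogram-based Shannon entropy estimate per sample. The function name, bin count, and the simulated transient are illustrative assumptions only.

```python
import numpy as np

def stream_entropy(samples: np.ndarray, n_bins: int = 256) -> float:
    """Histogram-based Shannon entropy estimate, in bits per sample,
    of a real-valued data stream (an illustrative stand-in for an
    information-content measure)."""
    counts, _ = np.histogram(samples, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
n = 100_000
noise = rng.normal(size=n)                       # noise-only stream
t = np.arange(n)
pulse = noise + 5.0 * np.exp(-0.5 * ((t - n // 2) / 500.0) ** 2)  # add a transient

print(f"noise only:     {stream_entropy(noise):.3f} bits/sample")
print(f"with transient: {stream_entropy(pulse):.3f} bits/sample")
```

In this toy setting, a transient changes the sample distribution and hence the entropy estimate; a practical measure along the lines discussed in the abstract would instead be built on the global and local signal features mentioned above.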
| Type of submission | Talk |
| --- | --- |