Monte Carlo Methods in Advanced Statistics Applications and Data Analysis
From Monday 18 November 2013 (08:00) to Friday 22 November 2013 (18:00)
Monday 18 November 2013
09:00 - 10:30
Registration
10:30 - 12:30
Basics 1: Basis of statistics, probability etc.
Allen Caldwell (Max Planck Institute)
12:30 - 14:00
Lunch break
14:00 - 15:30
Basics 2: Random numbers, distributions etc.
Allen Caldwell (Max Planck Institute)
15:30 - 16:00
Coffee break
16:00 - 18:00
Basics 3: Logic, information and Bayesian reasoning (lecture)
Torsten Ensslin (MPA)
18:30 - 20:00
Welcome reception
Tuesday 19 November 2013
09:00 - 10:30
Basics 4: Information field theory - from data to images (lecture)
Torsten Ensslin (MPA)
The problem of reconstructing an image or a function from data is generally ill-posed: the desired signal has an infinite number of degrees of freedom, whereas the data provide only a finite number of constraints. Additional statistical information and other knowledge have to be used to regularize the problem. Information field theory permits us to formulate signal inference problems rigorously, using probabilistic language to combine data and knowledge. It helps us to exploit existing methods developed for field theories to derive optimal reconstruction algorithms. In this course, an introduction to the basic principles of information field theory will be given and illustrated by concrete examples from astrophysical applications.
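For orientation, the simplest worked example of such an optimal reconstruction is the Wiener filter. Assuming, purely for illustration, a linear data model with Gaussian signal and noise statistics,

    d = R s + n,   s ~ \mathcal{G}(0, S),   n ~ \mathcal{G}(0, N),

the posterior mean of the signal and its uncertainty covariance take the standard form

    m = D j,   D = (S^{-1} + R^{\dagger} N^{-1} R)^{-1},   j = R^{\dagger} N^{-1} d.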
10:30 - 11:00
Coffee break
11:00 - 12:30
NIFTY: Numerical information field theory
Marco Selig (MPA Garching)
This tutorial introduces NIFTY ("Numerical Information Field Theory"), which allows a user an abstract formulation and programming of signal inference and image reconstruction algorithms. NIFTY is a versatile Python library designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. The tutorial covers the simulation of mock data from Gaussian random processes and a Wiener filter reconstruction of the underlying signal field from this data set. Using NIFTY, this filter can be applied to a variety of spaces, e.g., point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those.
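A rough idea of the exercise can be conveyed without NIFTY itself. The following sketch implements the same Wiener filter scenario with plain NumPy on a one-dimensional regular grid; it does not use the NIFTY API, and the power spectrum, noise level and grid size are assumptions chosen only for illustration.

import numpy as np

rng = np.random.default_rng(42)
npix = 256                                   # pixels of the regular grid
k = np.fft.rfftfreq(npix)                    # harmonic modes of that grid

# Assumed discrete signal power per mode (falling power law) and noise level.
p_signal = 2e3 / (1.0 + (k / 0.02) ** 2) ** 2
sigma_n = 0.3                                # pixel-space noise standard deviation
p_noise = sigma_n ** 2 * npix                # per-mode noise power under rfft

# Mock Gaussian random signal, drawn directly in harmonic space
# (DC/Nyquist subtleties are ignored in this sketch).
s_k = np.sqrt(p_signal / 2.0) * (rng.normal(size=k.size) + 1j * rng.normal(size=k.size))
signal = np.fft.irfft(s_k, n=npix)

# Mock data: unit instrument response plus white Gaussian noise, d = s + n.
data = signal + rng.normal(scale=sigma_n, size=npix)

# Wiener filter, mode by mode: m_k = P_k / (P_k + N_k) * d_k.
d_k = np.fft.rfft(data)
reconstruction = np.fft.irfft(p_signal / (p_signal + p_noise) * d_k, n=npix)

print("rms(data - signal)           =", np.std(data - signal))
print("rms(reconstruction - signal) =", np.std(reconstruction - signal))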
12:30 - 14:00
Lunch break
14:00 - 15:45
Bayesian mixture modelling
Fabrizia Guglielmetti (MPE Garching)
Bayesian mixture modelling provides a method to solve the long-standing problem of disentangling the background from the sources (Guglielmetti F., et al., 2009, MNRAS, 396, 165). The technique employs a joint estimate of the background and detection of the sources in astronomical images. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model. Uncertainties of the background and source signals are provided consistently. Background variations are properly modelled, and sources are detected independently of their shape. No background subtraction is needed for the detection of sources. Poisson statistics is applied rigorously throughout the whole algorithm. The technique is general and applicable to counting detectors. Practical demonstrations of the method will be given using simulated data sets and X-ray data observed by the ROSAT and Chandra satellites.
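As a toy illustration of the two-component mixture idea (not the actual algorithm of Guglielmetti et al.; the mixing weight and the exponential source prior below are assumptions), one can compute a per-pixel source probability from Poisson likelihoods as follows:

import numpy as np
from scipy.stats import poisson

def prob_source(n, background, p_source=0.1, mean_source=5.0):
    """Posterior probability that a pixel with n counts contains source flux,
    given a known expected background, via a two-component Poisson mixture."""
    # Component 1: background only.
    like_bg = poisson.pmf(n, background)
    # Component 2: background plus source; the source intensity is marginalized
    # numerically over an (assumed) exponential prior with mean `mean_source`.
    s_grid = np.linspace(0.0, 20.0 * mean_source, 2000)
    ds = s_grid[1] - s_grid[0]
    prior_s = np.exp(-s_grid / mean_source) / mean_source
    like_src = np.sum(poisson.pmf(n, background + s_grid) * prior_s) * ds
    num = p_source * like_src
    return num / (num + (1.0 - p_source) * like_bg)

# Example: pixels with 2, 8 and 20 counts on an expected background of 3 counts.
for counts in (2, 8, 20):
    print(counts, "counts -> P(source) =", round(prob_source(counts, 3.0), 3))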
15:45 - 16:15
Coffee break
16:15 - 18:00
Multivariate analysis
Balazs Kegl (LAL, Orsay)
Wednesday 20 November 2013
09:00 - 10:30
Markov chain Monte Carlo 1
Remi Bardenet (Oxford)
10:30 - 11:00
Coffee break
11:00 - 12:30
Markov chain Monte Carlo 2
Ralf Ulrich (KIT), Remi Bardenet (Oxford)
12:30 - 14:00
Lunch break
14:00 - 16:00
BAT - a complex Markov chain Monte Carlo application
Kevin Kroeninger (University of Goettingen)
The tutorial will give an introduction to the Bayesian Analysis Toolkit (BAT), a C++ tool for Bayesian inference. The software is based on algorithms for sampling, optimization and integration, the key algorithm being Markov chain Monte Carlo. Interfaces to existing software tools exist, e.g., to the ROOT implementation of Minuit and to the Cuba library. A simple physics example will be discussed and formulated as a statistical model in BAT. The first steps will include the calculation of marginal distributions and uncertainty propagation. The example will also be used to explain the basic functionalities of BAT.
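BAT itself is a C++ library, so the following is not BAT code; it is only a minimal Python sketch of the workflow the tutorial walks through (define a statistical model, sample its posterior with a Markov chain, read off marginal distributions), using a toy Poisson counting model whose numbers are assumptions for illustration.

import numpy as np

observed = 13        # observed number of events (assumed)
background = 5.0     # known expected background (assumed)

def log_posterior(signal):
    """Flat prior on the signal in [0, 50] times a Poisson likelihood."""
    if signal < 0.0 or signal > 50.0:
        return -np.inf
    mu = signal + background
    return observed * np.log(mu) - mu          # log Poisson, dropping log(n!)

rng = np.random.default_rng(1)
chain = np.empty(50_000)
current, logp = 10.0, log_posterior(10.0)
for i in range(chain.size):
    proposal = current + rng.normal(scale=2.0)          # random-walk proposal
    logp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_prop - logp:        # Metropolis accept step
        current, logp = proposal, logp_prop
    chain[i] = current

marginal = chain[5_000:]                                # discard burn-in
print("posterior mean of the signal:", marginal.mean())
print("68% central interval        :", np.percentile(marginal, [16, 84]))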
16:00 - 16:30
Coffee break
16:30 - 18:00
The STAN package: Bayesian Inference based on Hamiltonian Monte Carlo
Michael Betancourt
Thursday 21 November 2013
09:00 - 10:30
Basic sampling methods, convergence, variance reduction - and connections to MC event generators
Stefan Gieseke (KIT)
We consider Monte Carlo methods specific to their use in Monte Carlo event generators. After an introduction to Monte Carlo sampling and integration, we will discuss some methods of variance reduction with phase-space integration as the application in mind. Finally, we briefly discuss multi-channel integration as the key to the integration of multi-body final-state matrix elements.
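As a small taste of the variance-reduction theme (the integrand, proposal and sample size below are assumptions chosen for illustration, not material from the lecture), compare plain Monte Carlo integration with importance sampling on a peaked integrand:

import numpy as np

rng = np.random.default_rng(7)
n = 100_000

def f(x):
    # Sharply peaked integrand on [0, 1]; its integral is close to sqrt(pi)/10.
    return np.exp(-100.0 * (x - 0.5) ** 2)

# Plain Monte Carlo: uniform sampling on [0, 1].
estimates_plain = f(rng.uniform(size=n))

# Importance sampling: draw from a normal centred on the peak and reweight by
# f/q; draws outside [0, 1] get zero weight, so the same integral is estimated.
x = rng.normal(loc=0.5, scale=0.1, size=n)
q = np.exp(-0.5 * ((x - 0.5) / 0.1) ** 2) / (0.1 * np.sqrt(2.0 * np.pi))
estimates_is = np.where((x >= 0.0) & (x <= 1.0), f(x) / q, 0.0)

for name, est in (("uniform   ", estimates_plain), ("importance", estimates_is)):
    print(name, "I =", est.mean(), "+-", est.std(ddof=1) / np.sqrt(n))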
10:30 - 11:00
Coffee break
11:00 - 12:30
Exercises on MC sampling
Allen Caldwell (Max Planck Institute)
12:30 - 14:00
Lunch break
14:00 - 16:00
Nested sampling
Udo v. Toussaint (IPP Garching)
16:00 - 16:30
Coffee break
16:30 - 18:00
Nested sampling using PyMultiNest
Johannes Buchner (MPE Garching)
18:30 - 23:00
School dinner
Friday 22 November 2013
09:00 - 10:30
Population MC 1
Frederic Beaujean (MPI Munich)
Adaptive importance sampling, or population Monte Carlo (PMC), is a powerful technique to sample from and integrate over complicated distributions that may include degeneracies and multiple modes in up to roughly 40 dimensions. PMC is well suited to tough problems because the costly evaluation of the target distribution can be massively parallelized. Based on a simplified global fit for new physics, the individual parts of the algorithm, from initialization through proposal-function updates to the final results, are presented step by step in a hands-on and visual fashion. Only basic knowledge of C++ is required to modify the given source-code examples for a more rewarding learning experience.
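The following is a toy Python sketch of the adaptive importance sampling loop, with a single Gaussian proposal updated by moment matching; the PMC code used in the lecture is a C++ implementation with Gaussian-mixture proposals, and the target and all numbers below are assumptions for illustration.

import numpy as np
from scipy.stats import multivariate_normal

def log_target(x):
    # Assumed 2-d target: a correlated Gaussian standing in for a posterior.
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    return multivariate_normal(mean=[1.0, -2.0], cov=cov).logpdf(x)

rng = np.random.default_rng(3)
mean, cov, n = np.zeros(2), 4.0 * np.eye(2), 5_000    # broad initial proposal

for step in range(5):
    x = rng.multivariate_normal(mean, cov, size=n)                  # draw a population
    log_w = log_target(x) - multivariate_normal(mean, cov).logpdf(x)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                                    # normalized importance weights
    # Update the proposal from the weighted population (moment matching).
    mean = w @ x
    diff = x - mean
    cov = (w[:, None] * diff).T @ diff + 1e-6 * np.eye(2)
    print("step", step, "effective sample size:", round(1.0 / np.sum(w ** 2)))

print("final proposal mean:", mean)        # approaches the target mean (1, -2)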
10:30 - 11:00
Coffee break
11:00 - 12:30
Population MC 2
Frederic Beaujean (MPI Munich)
12:30 - 14:00
Lunch break
14:00 - 15:30
Q&A session