Journées MAS et Journée en l'honneur de Jacques Neveu

31 août - 3 septembre 2010 à Bordeaux

Eric Moulines (Télécom ParisTech)

Adaptive and Interacting Markov chain Monte Carlo algorithms (slides)

Adaptive and Interacting Markov chain Monte Carlo algorithms (MCMC) have been recently introduced in the literature.
These novel simulation algorithms are designed to improve sampling efficiency for complex distributions. Motivated by algorithms introduced recently (such as the Adaptive Metropolis (AM) algorithm and the Equi-Energy (EE) sampler), we develop a general methodological and theoretical framework covering a large number of adaptation algorithms. Within this framework, we are able to cope with both \emph{internal} and \emph{external} adaptation schemes. In the internal scheme, the current value of the parameter is adapted using the past samples of the process. In the external scheme, the adaptation is carried out using a number of auxiliary processes, which may interact. We also consider the case where the parameter to adapt is infinite-dimensional. This framework covers, among others, the AM and the EE samplers, but much more general variants (coupling the two adaptation techniques) can be considered as well.
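As a concrete illustration of the internal adaptation scheme, the following is a minimal sketch (not the authors' implementation) of an Adaptive Metropolis step in the style of Haario et al.: the Gaussian proposal covariance is the adaptation parameter, updated recursively from the past samples of the chain. The target, function names, and tuning constants here are illustrative assumptions.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, n_iter, eps=1e-6, seed=0):
    """Adaptive Metropolis sketch: the proposal covariance (the
    adapted parameter) is updated from the empirical covariance of
    the past samples -- an *internal* adaptation scheme."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.4 ** 2 / d                     # classical AM scaling factor
    x = np.array(x0, dtype=float)
    lp = log_target(x)
    mean, cov = x.copy(), np.eye(d)       # running moments of the chain
    samples = np.empty((n_iter, d))
    for t in range(n_iter):
        prop_cov = sd * (cov + eps * np.eye(d))   # regularized proposal
        y = rng.multivariate_normal(x, prop_cov)
        lpy = log_target(y)
        if np.log(rng.uniform()) < lpy - lp:      # Metropolis accept/reject
            x, lp = y, lpy
        samples[t] = x
        # recursive update of the empirical mean and covariance
        w = 1.0 / (t + 2)
        delta = x - mean
        mean = mean + w * delta
        cov = (1 - w) * cov + w * np.outer(delta, x - mean)
    return samples

# usage: sample a strongly correlated 2-D Gaussian (illustrative target)
prec = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
log_target = lambda x: -0.5 * x @ prec @ x
chain = adaptive_metropolis(log_target, np.zeros(2), 20000)
```

In an external scheme, by contrast, the proposal parameters would be driven by auxiliary chains (as in the EE sampler) rather than by the chain's own past.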

We develop a general theory to establish the convergence of these algorithms. For that purpose, we study the convergence of randomly perturbed Markov operators. Using these results, we establish the convergence of the marginal distributions for bounded functions, a strong law of large numbers and a central limit theorem, generalizing previously known results. This theory also yields guidelines for the design of proper algorithms.
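Schematically, and with notation assumed here for illustration ($\pi$ the target distribution, $(X_k)$ the adaptive chain, $f$ a bounded function), the three limit results take the familiar form:

```latex
\begin{align*}
  &\text{convergence of the marginals:} &
    \mathbb{E}[f(X_n)] &\longrightarrow \pi(f), \\
  &\text{strong law of large numbers:} &
    \frac{1}{n}\sum_{k=1}^{n} f(X_k) &\longrightarrow \pi(f)
    \quad \text{a.s.}, \\
  &\text{central limit theorem:} &
    \sqrt{n}\Big(\frac{1}{n}\sum_{k=1}^{n} f(X_k) - \pi(f)\Big)
    &\Longrightarrow \mathcal{N}\big(0, \sigma^2(f)\big).
\end{align*}
```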

We then use this methodological framework to analyse some classical and novel adaptive algorithms, which prove to be robust and reliable in practice.

The behavior of these algorithms is illustrated on some difficult multimodal scenarios in computational genetics.