Kent Academic Repository

Adaptive independent sticky MCMC algorithms

Leisen, Fabrizio, Casarin, Roberto, Martino, Luca, Luengo, David (2018) Adaptive independent sticky MCMC algorithms. EURASIP Journal on Advances in Signal Processing, 2018 . Article Number 5. ISSN 1687-6180. (doi:10.1186/s13634-017-0524-6) (KAR id:65717)


Monte Carlo methods have become essential tools for solving complex Bayesian inference problems in fields such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov Chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities that become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points that is constructed iteratively from previously drawn samples. The algorithm's efficiency is ensured by a test that supervises the evolution of the set of support points. This extra stage controls the computational cost and the convergence of the proposal density to the target. Each part of the novel family of algorithms is discussed, and several examples of specific methods are provided. Although the novel algorithms are presented for univariate target densities, we show how they can easily be extended to the multivariate context by embedding them within a Gibbs-type sampler or the hit-and-run algorithm. Ergodicity is ensured and discussed. An overview of related work in the literature is also provided, emphasizing that several well-known existing methods (such as the adaptive rejection Metropolis sampling (ARMS) scheme) are encompassed by the new class of algorithms proposed here. Eight numerical examples (including the inference of the hyper-parameters of Gaussian processes, widely used in machine learning for signal processing applications) illustrate the efficiency of sticky schemes, both as stand-alone methods to sample from complicated one-dimensional pdfs and within Gibbs samplers to draw from multi-dimensional target distributions.
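To make the abstract's description concrete, the following is a minimal, hedged sketch of one such sampler in Python. It is not the authors' implementation: it assumes a simplified piecewise-constant proposal (interpolating the unnormalized target at the support points) on a bounded interval, an independent Metropolis-Hastings acceptance step, and a simplified version of the "update test" that adds the candidate point to the support set with a probability growing with the proposal/target mismatch at that point. The target density, interval bounds, and mismatch rule are all illustrative choices.

```python
# Hedged sketch of an adaptive independent sticky MCMC step.
# Assumptions (not from the paper's exact construction): piecewise-constant
# proposal, bounded support [LO, HI], mismatch-based update test.
import math
import random

random.seed(0)

def target(x):
    # Illustrative unnormalized bounded target: a bimodal Gaussian mixture.
    return math.exp(-0.5 * (x - 2.0) ** 2) + 0.7 * math.exp(-0.5 * (x + 2.0) ** 2)

LO, HI = -6.0, 6.0  # bounded support used for the piecewise proposal

def build_proposal(points):
    """Piecewise-constant proposal matching the target at the support points."""
    pts = sorted(points)
    edges = [LO] + [(a + b) / 2 for a, b in zip(pts, pts[1:])] + [HI]
    heights = [target(p) for p in pts]
    areas = [h * (e1 - e0) for h, e0, e1 in zip(heights, edges, edges[1:])]
    z = sum(areas)  # normalizing constant of the proposal
    return edges, heights, [a / z for a in areas], z

def sample_proposal(edges, probs):
    # Draw a piece index by its probability mass, then sample uniformly in it.
    u, c = random.random(), 0.0
    for i, p in enumerate(probs):
        c += p
        if u <= c:
            return random.uniform(edges[i], edges[i + 1])
    return random.uniform(edges[-2], edges[-1])

def proposal_pdf(x, edges, heights, z):
    for i, h in enumerate(heights):
        if edges[i] <= x <= edges[i + 1]:
            return h / z
    return 0.0

def sticky_mcmc(n_iters, support=(-4.0, -1.0, 1.0, 4.0)):
    support = list(support)
    x, chain = 0.0, []
    for _ in range(n_iters):
        edges, heights, probs, z = build_proposal(support)
        xp = sample_proposal(edges, probs)
        qx = proposal_pdf(x, edges, heights, z)
        qxp = proposal_pdf(xp, edges, heights, z)
        # Independent Metropolis-Hastings acceptance ratio.
        alpha = min(1.0, (target(xp) * qx) / (target(x) * qxp))
        if random.random() < alpha:
            x = xp
        # Simplified "sticky" update test: add xp to the support set with a
        # probability that grows with the proposal/target mismatch at xp, so
        # the proposal converges to the target while the set stays controlled.
        mismatch = abs(target(xp) / z - qxp) / max(target(xp) / z, qxp)
        if random.random() < mismatch:
            support.append(xp)
        chain.append(x)
    return chain, support

chain, support = sticky_mcmc(5000)
print(len(support))            # additions slow down as the proposal improves
print(sum(chain) / len(chain))
```

At the support points the proposal equals the (rescaled) target, so the mismatch there is zero and those points are never re-added; new points are recruited mainly where the interpolation is poor, which is the mechanism the abstract describes for driving the proposal toward the target at a controlled computational cost.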

Item Type: Article
DOI/Identification number: 10.1186/s13634-017-0524-6
Uncontrolled keywords: Bayesian inference, Monte Carlo methods, Adaptive Markov chain Monte Carlo (MCMC), Adaptive rejection Metropolis sampling (ARMS), Gibbs sampling, Metropolis-within-Gibbs, Hit and run algorithm
Subjects: H Social Sciences > HA Statistics
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Mathematics, Statistics and Actuarial Science
Depositing User: Fabrizio Leisen
Date Deposited: 11 Jan 2018 19:36 UTC
Last Modified: 04 Mar 2024 16:25 UTC
