Sitemap

A list of all the posts and pages found on the site. For any robots out there, an XML version is available for digesting as well.

Pages

Posts

Future Blog Post

less than 1 minute read

Published:

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
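In a Jekyll site this switch lives in the site's configuration file; a minimal sketch (the comment and surrounding context are illustrative, not this site's actual settings):

```yaml
# _config.yml — controls whether posts dated in the future are built
future: false   # skip future-dated posts; set to true to publish them anyway
```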

Blog Post number 4

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

portfolio

publications

Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method

Published in SIAM Journal on Imaging Sciences, (arXiv, code), 2020

We present a highly efficient proximal Markov chain Monte Carlo methodology to perform Bayesian computation in imaging problems. Similarly to previous proximal Monte Carlo approaches, the proposed method is derived from an approximation of the Langevin diffusion. However, instead of the conventional Euler–Maruyama approximation that underpins existing proximal Monte Carlo methods, here we use a state-of-the-art orthogonal Runge–Kutta–Chebyshev stochastic approximation [A. Abdulle, I. Almuslimani, and G. Vilmart, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 937–964] that combines several gradient evaluations to significantly accelerate its convergence speed, similarly to accelerated gradient optimization methods. The proposed methodology is demonstrated via a range of numerical experiments, including non-blind image deconvolution, hyperspectral unmixing, and tomographic reconstruction, with total-variation and $\ell_1$-type priors. Comparisons with Euler-type proximal Monte Carlo methods confirm that the Markov chains generated with our method exhibit significantly faster convergence speeds, achieve larger effective sample sizes, and produce lower mean-square estimation errors at equal computational budget.

Recommended citation: Marcelo Pereyra, Luis A. Vargas-Mieles, and Konstantinos C. Zygalakis, "Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method", SIAM J. Imaging Sci., Vol. 13, No. 2, 2020, pp. 87-118. https://doi.org/10.1137/19M1283719
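The Euler-type proximal Monte Carlo baseline that this method is compared against can be sketched with a MYULA-style update on a toy one-dimensional target; the target, step size, and smoothing parameter below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal map of t*|x| (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula(n_iter=20000, gamma=0.02, lam=0.1, seed=0):
    """Toy MYULA chain targeting pi(x) ∝ exp(-x^2/2 - |x|):
    the smooth part x^2/2 enters via its gradient, and the
    non-smooth |x| via its Moreau-Yosida gradient
    (x - prox_{lam|.|}(x)) / lam."""
    rng = np.random.default_rng(seed)
    x, samples = 0.0, []
    for _ in range(n_iter):
        grad_smooth = x                               # d/dx of x^2/2
        grad_my = (x - soft_threshold(x, lam)) / lam  # Moreau-Yosida gradient
        x = (x - gamma * (grad_smooth + grad_my)
               + np.sqrt(2 * gamma) * rng.standard_normal())
        samples.append(x)
    return np.array(samples)

samples = myula()
# the target is symmetric about 0, so the sample mean should be near 0
```

The accelerated sampler in the paper replaces this single Euler–Maruyama step with a Runge–Kutta–Chebyshev stage cascade of several gradient evaluations per iteration, which is what yields the faster convergence reported in the abstract.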

The forward–backward envelope for sampling with the overdamped Langevin algorithm

Published in Statistics and Computing, (arXiv), 2023

In this paper, we analyse a proximal method based on the idea of forward–backward splitting for sampling from distributions with densities that are not necessarily smooth. In particular, we study the non-asymptotic properties of the Euler–Maruyama discretization of the Langevin equation, where the forward–backward envelope is used to deal with the non-smooth part of the dynamics. An advantage of this envelope, when compared to the widely used Moreau–Yosida envelope and the MYULA algorithm, is that it maintains the MAP estimator of the original non-smooth distribution. We also present a number of numerical experiments that support our theoretical findings.

Recommended citation: Armin Eftekhari, Luis A. Vargas-Mieles, and Konstantinos C. Zygalakis, "The forward–backward envelope for sampling with the overdamped Langevin algorithm", Statistics and Computing, Vol. 33, No. 85, 2023. https://link.springer.com/article/10.1007/s11222-023-10254-y
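The MAP-preservation property mentioned in the abstract can be illustrated on a toy one-dimensional target; the choice of f(x) = x²/2 and g(x) = |x|, and the gradient formula specialised to that quadratic f, are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal map of t*|x| (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def grad_fbe(x, gamma):
    """Gradient of the forward-backward envelope of f + g for the toy
    pair f(x) = x^2/2 (grad f = x, f'' = 1) and g(x) = |x|:
    (1/gamma) * (1 - gamma * f''(x)) * (x - prox_{gamma*g}(x - gamma*grad_f(x)))."""
    fb_step = soft_threshold(x - gamma * x, gamma)
    return (1.0 - gamma) / gamma * (x - fb_step)

def fbe_langevin(n_iter=20000, gamma=0.1, seed=1):
    """Euler-Maruyama discretization of the overdamped Langevin
    diffusion driven by the forward-backward envelope."""
    rng = np.random.default_rng(seed)
    x, out = 0.0, []
    for _ in range(n_iter):
        x = (x - gamma * grad_fbe(x, gamma)
               + np.sqrt(2 * gamma) * rng.standard_normal())
        out.append(x)
    return np.array(out)

# grad_fbe vanishes at x = 0, the minimiser of f + g, so the envelope
# keeps the MAP estimator of the original non-smooth target
```

Note that grad_fbe(0, gamma) = 0 exactly: unlike the Moreau–Yosida envelope, whose minimiser is generally shifted, the forward–backward envelope shares its minimiser with f + g, which is the property the abstract highlights.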

The split Gibbs sampler revisited: improvements to its algorithmic structure and augmented target distribution

Published in SIAM Journal on Imaging Sciences, (arXiv, code), 2023

This paper proposes a new accelerated proximal Markov chain Monte Carlo (MCMC) methodology to perform Bayesian computation efficiently in imaging inverse problems. The proposed methodology is derived from the Langevin diffusion process and stems from tightly integrating two state-of-the-art proximal Langevin MCMC samplers, SK-ROCK and split Gibbs sampling (SGS), which employ distinctively different strategies to improve convergence speed. More precisely, we show how to integrate, at the level of the Langevin diffusion process, the proximal SK-ROCK sampler which is based on a stochastic Runge-Kutta-Chebyshev approximation of the diffusion, with the model augmentation and relaxation strategy that SGS exploits to speed up Bayesian computation at the expense of asymptotic bias. This leads to a new and faster proximal SK-ROCK sampler that combines the accelerated quality of the original SK-ROCK sampler with the computational benefits of augmentation and relaxation. Moreover, rather than viewing the augmented and relaxed model as an approximation of the target model, positioning relaxation in a bias-variance trade-off, we propose to regard the augmented and relaxed model as a generalisation of the target model. This then allows us to carefully calibrate the amount of relaxation in order to simultaneously improve the accuracy of the model (as measured by the model evidence) and the sampler convergence speed. To achieve this, we derive an empirical Bayesian method to automatically estimate the optimal amount of relaxation by maximum marginal likelihood estimation. The proposed methodology is demonstrated with a range of numerical experiments related to image deblurring and inpainting, as well as with comparisons with alternative approaches from the state of the art.

Recommended citation: Marcelo Pereyra, Luis A. Vargas-Mieles, and Konstantinos C. Zygalakis, "The Split Gibbs Sampler Revisited: Improvements to Its Algorithmic Structure and Augmented Target Distribution", SIAM J. Imaging Sci., Vol. 16, No. 4, 2023, pp. 2040-2071. https://doi.org/10.1137/22M1506122
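The augmentation-and-relaxation structure that SGS exploits can be sketched on a toy target where both conditionals are Gaussian and hence can be sampled exactly; the Gaussian likelihood and prior, and the numbers below, are illustrative assumptions, not the paper's imaging experiments:

```python
import numpy as np

def split_gibbs(y=2.0, sig2=1.0, rho2=0.25, n_iter=20000, seed=2):
    """Toy split Gibbs sampler for the augmented, relaxed target
    p(x, z) ∝ exp(-(x - y)^2 / (2*sig2) - z^2/2 - (x - z)^2 / (2*rho2)),
    i.e. a Gaussian likelihood on x, a Gaussian prior moved onto the
    splitting variable z, and a quadratic coupling with relaxation rho2.
    Alternates exact draws from x | z and z | x."""
    rng = np.random.default_rng(seed)
    x = z = 0.0
    xs = []
    for _ in range(n_iter):
        # x | z is Gaussian with precision 1/sig2 + 1/rho2
        var_x = 1.0 / (1.0 / sig2 + 1.0 / rho2)
        x = var_x * (y / sig2 + z / rho2) + np.sqrt(var_x) * rng.standard_normal()
        # z | x is Gaussian with precision 1 + 1/rho2
        var_z = 1.0 / (1.0 + 1.0 / rho2)
        z = var_z * (x / rho2) + np.sqrt(var_z) * rng.standard_normal()
        xs.append(x)
    return np.array(xs)
```

Integrating z out of this toy joint gives a marginal on x with precision 1/sig2 + 1/(1 + rho2), so the relaxation rho2 visibly flattens the prior term: as rho2 → 0 the exact posterior is recovered, which is the bias-variance trade-off (and the model-generalisation view) discussed in the abstract.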

talks

Hausdorff School on MCMC

Published:

Talk: Accelerating proximal Markov chain Monte Carlo by using an explicit stabilised method.

teaching

Tutor, 2018 - 2022, School of Mathematics, University of Edinburgh

Undergraduate & Postgraduate course, School of Mathematics, University of Edinburgh, 2018

Courses: Calculus and its Applications, Engineering Mathematics, Several Variable Calculus and Differential Equations, Applied Stochastic Differential Equations, Numerical Partial Differential Equations.