| 4 Dec |
Giorgos Vasdekis
|
Sampling with time-changed Markov processes
▶ Abstract
We introduce a framework of time-changed Markov processes to speed up the convergence of Markov chain Monte Carlo (MCMC) algorithms in the context of multimodal distributions and rare event simulation. The time-changed process is defined by adjusting the speed of time of a base process via a user-chosen, state-dependent function. We apply this framework to several Markov processes from the MCMC literature, such as Langevin diffusions and piecewise deterministic Markov processes, obtaining novel modifications of classical algorithms and also re-discovering known MCMC algorithms. We prove theoretical properties of the time-changed process under suitable conditions on the base process, focusing on connecting the stationary distributions and qualitative convergence properties such as geometric and uniform ergodicity, as well as a functional central limit theorem. Time permitting, we will compare our approach with the framework of space transformations, clarifying the similarities between the approaches. This is joint work with Andrea Bertazzi.
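For orientation, the basic reweighting effect of such a time change can be sketched as follows (an illustrative formulation under standard regularity assumptions, not the talk's precise statement): if the base process has generator $\mathcal{L}$ and stationary density $\pi$, and $s > 0$ is the user-chosen speed function, then the time-changed process has generator and stationary density

\[
\mathcal{A}f(x) \;=\; s(x)\,\mathcal{L}f(x), \qquad \tilde{\pi}(x) \;\propto\; \frac{\pi(x)}{s(x)},
\]

provided $\pi/s$ is integrable. Choosing $s$ large where $\pi$ is small lets the process traverse low-probability regions quickly, at the price of changing the stationary distribution from $\pi$ to $\tilde{\pi}$.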
|
Slides
|
| 20 Nov |
Lanya Yang
Lancaster University
|
Exchangeable Particle Gibbs for Markov Jump Processes
▶ Abstract
Inference in stochastic reaction-network models—such as the SEIR epidemic model or the Lotka–Volterra predator–prey system—is crucial for understanding the dynamics of interacting systems in epidemiology, ecology, and systems biology. These models are typically represented as Markov jump processes (MJPs) with intractable likelihoods. As a result, particle Markov chain Monte Carlo (particle MCMC) methods, particularly the Particle Gibbs (PG) sampler, have become standard tools for Bayesian inference. However, PG suffers from severe particle degeneracy, especially in high-dimensional state spaces, leading to poor mixing and inefficiency. In this talk, I focus on improving the efficiency of particle MCMC methods for inference in reaction networks by addressing the degeneracy problem. Building on recent work on the Exchangeable Particle Gibbs (xPG) sampler for continuous-state diffusions, this project develops a novel version of xPG tailored to discrete-state reaction networks, where randomness is driven by Poisson processes rather than Brownian motion. The proposed method retains the exchangeability framework of xPG while adapting it to the structural and computational challenges specific to reaction networks.
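As a point of reference for the Poisson-driven structure mentioned above: a reaction network with stoichiometric vectors $\nu_k$ and hazard functions $h_k$ admits the standard random time-change representation

\[
X_t \;=\; X_0 + \sum_{k} \nu_k \, Y_k\!\left( \int_0^t h_k(X_s)\,\mathrm{d}s \right),
\]

where the $Y_k$ are independent unit-rate Poisson processes, which is the sense in which the randomness is Poisson-driven rather than Brownian.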
|
Slides
|
| 30 Oct |
Rui Zhang
Lancaster University
|
A Dynamic Perspective of Matérn Gaussian Processes
▶ Abstract
The ubiquitous Gaussian process (GP) models in statistics and machine learning (Williams and Rasmussen, 2006) are static by default, whether viewed in the weight-space or function-space sense (Kanagawa et al., 2025): the observation and test locations have no unilateral dependency order, which also explains the cubic scaling of computational cost for GP regression. The dynamic view of Gaussian processes, by contrast, while only available for a class of GP models, reformulates the dependency structure unilaterally (Whittle, 1954) to enable sequential inference for GP regression, with computational cost that can scale linearly (Hartikainen and Särkkä, 2010; Särkkä and Hartikainen, 2012) with little to no approximation. This talk explores this dynamic perspective of (Matérn) Gaussian processes and some of its consequences.
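As a concrete instance of the dynamic view (a standard one-dimensional example rather than the talk's general construction): the Matérn-1/2 kernel $k(t,t') = \sigma^2 \exp(-|t-t'|/\ell)$ is the covariance of the stationary Ornstein-Uhlenbeck process

\[
\mathrm{d}X_t \;=\; -\tfrac{1}{\ell}\,X_t\,\mathrm{d}t + \sqrt{\tfrac{2\sigma^2}{\ell}}\,\mathrm{d}W_t,
\]

so GP regression with this kernel can be carried out by Kalman filtering and smoothing over the ordered observation times, with cost linear in the number of observations; higher-order Matérn kernels correspond to higher-dimensional linear SDEs.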
|
Slides
HTML Slides
|
| 16 Oct |
Henry Moss
Lancaster University
|
GPGreen: Linear Operator Learning with Gaussian Processes
|
|
| 4 Sep |
Rafael Izbicki
Federal University of São Carlos, Brazil
|
Simulation-Based Calibration of Confidence Sets for Statistical Models
|
|
| 7 Aug |
Jixiang Qing
Imperial College London
|
Bayesian Optimization Over Graphs With Shortest-Path Encodings
|
|
| 17 Jul |
Maciej Buze
|
Barycenters in Unbalanced Optimal Transport
|
|
| 3 Jul |
Henry Moss
|
Fusing Neural and Statistical Models
|
|
| 19 Jun |
Takuo Matsubara
University of Edinburgh
|
Wasserstein Gradient Boosting: A Framework for Distribution-Valued Supervised Learning
|
NeurIPS
|
| 12 Jun |
Augustin Chevallier
University of Strasbourg
|
Towards Practical PDMP Sampling: Metropolis Adjustments, Locally Adaptive Step-Sizes, and NUTS-Based Time Lengths
|
|
| 29 May |
Dennis Prangle
University of Bristol
|
Distilling Importance Sampling for Likelihood Free Inference
|
JCGS
|
| 15 May |
Yuga Iguchi
|
A Closed-Form Transition Density Expansion for Elliptic and Hypo-Elliptic SDEs
|
|
| 7 May |
Liam Llamazares Elias
|
A Parameterization of Anisotropic Gaussian Fields With Penalized Complexity Priors
|
|
| 20 Mar |
Chris Nemeth
|
Can ODEs Make Monte Carlo Methods Great Again?
|
|
| 6 Mar |
Richard Everitt
University of Warwick
|
ABC-SMC^2 and Ensemble Kalman Inversion ABC
|
|
| 20 Feb |
Paul Fearnhead
|
Optimised Annealed Sequential Monte Carlo Samplers
|
|
| 6 Feb |
Adrien Corenflos
University of Warwick
|
High-Dimensional Inference in State-Space Models via an Auxiliary Variable Trick
|
|
| 30 Jan |
Tim Rogers
University of Sheffield
|
Learning About Dynamical Systems with Gaussian Processes
|
|
| 10 Dec |
Lorenzo Rimella
University of Turin
|
Categorical Approximate Likelihood for individual-based models
|
|
| 28 Nov |
Connie Trojan
|
Diffusion Generative Modelling for Divide-and-Conquer MCMC
|
|
| 21 Nov |
Maximillian Steffen
Karlsruhe Institute of Technology
|
Statistical guarantees for stochastic Metropolis-Hastings
|
|
| 27 Jun |
Chris Sherlock
|
Tuning pseudo-marginal Metropolis-Hastings: a vase or two faces?
|
|
| 20 Jun |
Claire Gormley
University College Dublin
|
Bayesian nonparametric modelling of network data
|
|
| 13 Jun |
Saifuddin Syed
University of Oxford
|
Scaling inference of MCMC algorithms with parallel computing
|
JRSSB
|
| 6 Jun |
Rui Zhang
|
Unadjusted Barker as an SDE Numerical Scheme
|
|
| 16 May |
Wentao Li
|
Optimal combination of composite likelihoods using approximate Bayesian computation with application to state-space models
|
|
| 9 May |
Gabriel Wallin
|
Rotation to Sparse Loadings using Lp Losses and Related Inference Problems
|
|
| 11 Apr |
François-Xavier Briol
|
Robust and conjugate Gaussian process regression
|
|
| 21 Mar |
Leandro Marcolino
|
Identifying Adversaries in Ad-hoc Domains Using Q-valued Bayesian Estimations
|
AAMAS
|
| 14 Mar |
Theo Papamarkou
|
Aspects of sampling-based inference for Bayesian neural networks
|
|
| 7 Mar |
Tamas Papp
|
Simulating the independence sampler parallel-in-time
|
|
| 22 Feb |
Francesco Barile
|
Flexible modeling of heterogeneous populations of networks: a Bayesian nonparametric approach
|
|
| 15 Feb |
Kamélia Daudel
|
Alpha-divergence Variational Inference Meets Importance Weighted Auto-Encoders: Methodology and Asymptotics
|
JMLR
|
| 8 Feb |
Estevão Prado
|
Accounting for shared covariates in semi-parametric Bayesian additive regression trees
|
|
| 1 Feb |
Lorenzo Rimella
|
A State-Space Perspective on Modelling and Inference for Online Skill Rating
|
|
| 14 Dec |
Chris Jewell
|
Data Augmentation MCMC on epidemic models
|
|
| 7 Dec |
Marina Riabiz
|
Optimal Thinning of MCMC Output
|
JRSSB
|
| 30 Nov |
Lorenzo Rimella
|
Simulation Based Composite Likelihood
|
|
| 23 Nov |
Andy Wang
|
Comparison theorems for Hybrid Slice Sampling
|
|
| 16 Nov |
Chris Sherlock
|
Ensemble Kalman filter
|
|
| 9 Nov |
Chris Nemeth
|
Bayesian Flow Networks
|
|
| 2 Nov |
Aretha Teckentrup
|
Gaussian processes, inverse problems and Markov chain Monte Carlo
|
|
| 26 Oct |
Sam Holdstock
|
Improved inference for stochastic kinetic models with small observation error via partial Rao-Blackwellisation
|
|
| 19 Oct |
Estevão Prado
|
Metropolis-Hastings with fast, flexible sub-sampling
|
|
| 12 Oct |
Alberto Cabezas
|
Composable Inference in BlackJAX
|
|
| 29 Jun |
Tamas Papp
|
Introduction to diffusion generative models
|
|
| 22 Jun |
Chris Sherlock
|
Fast return-level estimates for flood insurance via an improved Bennett inequality for random variables with differing upper bounds
|
Ongoing work
|
| 15 Jun |
Alice Corbella
University of Warwick
|
The Lifebelt Particle Filter for robust estimation from low-valued count data
|
|
| 8 Jun |
Francesca Panero
London School of Economics
|
Modelling sparse networks with Bayesian nonparametrics
|
Ongoing work
|
| 18 May |
Lorenzo Rimella
|
Localised filtering algorithm: the BPF and the Graph Filter
|
Link
Link
|
| 11 May |
Francesca Crucinio
ENSAE
|
Divide-and-Conquer SMC with applications to high dimensional filtering
|
Statistica Sinica
|
| 27 Apr |
Paul Fearnhead
|
Automatic Differentiation of Programs with Discrete Randomness
|
NeurIPS
|
| 2 Mar |
Chris Sherlock
|
KSD for dummies
|
|
| 23 Feb |
Victor Elvira
University of Edinburgh
|
State-Space Models as Graphs
|
|
| 16 Feb |
Sam Livingstone
University College London
|
Pre-conditioning in Markov chain Monte Carlo
|
Ongoing work
|
| 26 Jan |
Estevao Batista Do Prado
|
Bayesian additive regression trees (BART)
|
Link
Link
|
| 19 Jan |
Alberto Cabezas Gonzalez
|
Stereographic Markov Chain Monte Carlo
|
|
| 12 Jan |
Alexander Terenin
University of Cambridge
|
Pathwise Conditioning and Non-Euclidean Gaussian Processes
|
|
| 15 Dec |
Tamas Papp
|
Coupling MCMC algorithms in high dimensions
|
|
| 8 Dec |
Yu Luo
|
Bayesian estimation using loss functions
|
|
| 24 Nov |
Sam Power
University of Bristol
|
Explicit convergence bounds for Metropolis Markov chains: isoperimetry, spectral gaps and profiles
|
|
| 17 Nov |
Mauro Camara Escudero
University of Bristol
|
Approximate Manifold Sampling
|
|
| 3 Nov |
Alexandros Beskos
UCL
|
Manifold Markov chain Monte Carlo methods for Bayesian inference in diffusion models
|
|
| 27 Oct |
Jure Vogrinc
University of Warwick
|
The Barker proposal: Combining robustness and efficiency in gradient-based MCMC
|
|
| 20 Oct |
Paul Fearnhead
|
Martingale posterior distributions
|
|
| 6 Oct |
Michael Whitehouse
University of Bristol
|
Consistent and fast inference in compartmental models of epidemics using PAL
|
|
| 29 Sep |
Chris Sherlock
|
Comparison of Markov chains via weak Poincaré inequalities with application to pseudo-marginal MCMC
|
|
| 22 Sep |
Lorenzo Rimella
|
Inference in Stochastic Epidemic Models via Multinomial Approximations
|
|
| 15 Sep |
Chris Nemeth
|
Metropolis–Hastings via Classification
|
|
| 23 Jun |
Paul Fearnhead
|
Non-Reversible Parallel Tempering: a Scalable Highly Parallel MCMC Scheme
|
|
| 9 Jun |
Augustin Chevallier
|
Continuously-Tempered PDMP samplers
|
|
| 26 May |
Chris Sherlock
|
Scalable Importance Tempering and Bayesian Variable Selection
|
|
| 5 May |
Steffen Grünewälder
|
Compressed Empirical Measures (in finite dimensions)
|
|
| 31 Mar |
Louis Sharrock
University of Bristol
|
Parameter Estimation for the McKean-Vlasov Stochastic Differential Equation
|
|
| 24 Mar |
Alberto Cabezas Gonzalez
|
Elliptical slice sampling
|
|
| 17 Mar |
Augustin Chevallier
|
Slice sampling & PDMP
|
|
| 3 Mar |
Paul Fearnhead
|
Boost your favorite MCMC sampler using Kac’s theorem: the Kick-Kac teleportation algorithm - Part 2
|
|
| 24 Feb |
Paul Fearnhead
|
Boost your favorite MCMC sampler using Kac’s theorem: the Kick-Kac teleportation algorithm - Part 1
|
|
| 17 Feb |
Lionel Riou-Durand
University of Warwick
|
Metropolis Adjusted Underdamped Langevin Trajectories: a robust alternative to Hamiltonian Monte-Carlo
|
|
| 3 Feb |
Chris Sherlock
|
Statistical scalability and approximate inference in distributed computing environments
|
|
| 27 Jan |
Lorenzo Rimella
|
The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables
|
|
| 20 Jan |
Augustin Chevallier
|
Non-reversible guided Metropolis kernel
|
|
| 13 Jan |
Szymon Urbas
|
The Apogee to Apogee Path Sampler
|
|
| 9 Dec |
Chris Sherlock
|
Metropolis-Hastings with Averaged Acceptance Ratios
|
|
| 2 Dec |
Chris Nemeth
|
Waste-free sequential Monte Carlo
|
|
| 25 Nov |
Sam Power
University of Bristol
|
Double Control Variates for Gradient Estimation in Discrete Latent Variable Models
|
|
| 18 Nov |
Chris Nemeth
|
How do you tune MCMC algorithms?
|
|
| 4 Nov |
Augustin Chevallier
|
Approximations of Piecewise Deterministic Markov Processes and their convergence properties
|
|
| 28 Oct |
Paul Fearnhead
|
Multilevel Linear Models, Gibbs Samplers and Multigrid Decompositions
|
|
| 14 Oct |
Tamas Papp
|
Estimating Markov chain convergence with empirical Wasserstein distance bounds
|
|
| 22 Jun |
Gael Martin
Monash University
|
Landmark papers: Bayesian computation from 1763 to the 21st Century
|
|
| 10 Jun |
Phyllis Ju
Harvard University
|
Sequential Monte Carlo algorithms for agent-based models of disease transmission
|
|
| 27 May |
Christian P. Robert
Université Paris-Dauphine
|
Landmark papers: Harold Jeffreys’s Theory of Probability Revisited
|
|
| 13 May |
Lorenzo Rimella
|
Dynamic Bayesian Neural Networks
|
|
| 29 Apr |
Clement Lee
|
Landmark papers: The Gelman-Rubin statistic: old and new
|
|
| 15 Apr |
George Bolt
|
MCMC Sampling and Posterior Inference for a New Metric-Based Network Model
|
|
| 25 Mar |
Jeremie Coullon
|
Landmark papers: the Metropolis sampler (1953)
|
|
| 4 Mar |
Chris Sherlock
|
Differentiable Particle Filtering via Entropy-Regularized Optimal Transport
|
|
| 19 Nov |
Liam Hodgkinson
|
Stein kernels
|
|
| 3 Mar |
Chris Nemeth
|
Deep generative modelling: autoencoders, VAEs, GANs… and all that jazz! Part 2
|
|
| 27 Feb |
Chris Nemeth
|
Deep generative modelling: autoencoders, VAEs, GANs… and all that jazz!
|
|
| 13 Feb |
Leah South
|
The kernel Stein discrepancy
|
|
| 5 Dec |
Paul Fearnhead
|
Zig Zag Sampler
|
|
| 24 Nov |
Francois-Xavier Briol
|
Statistical Inference for Generative Models with Maximum Mean Discrepancy
▶ Abstract
While likelihood-based inference and its variants provide a statistically efficient and widely applicable approach to parametric inference, their application to models involving intractable likelihoods poses challenges. In this work, we study a class of minimum distance estimators for intractable generative models, that is, statistical models for which the likelihood is intractable but simulation is cheap. The distance considered, maximum mean discrepancy (MMD), is defined through the embedding of probability measures into a reproducing kernel Hilbert space.
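For completeness, the squared MMD between distributions $P$ and $Q$ with kernel $k$ can be written as

\[
\mathrm{MMD}_k^2(P,Q) \;=\; \mathbb{E}\,k(X,X') - 2\,\mathbb{E}\,k(X,Y) + \mathbb{E}\,k(Y,Y'), \qquad X, X' \sim P,\; Y, Y' \sim Q,
\]

which depends on the model only through samples, so it can be estimated by simulation from the generative model; the minimum MMD estimator then minimises this discrepancy between the model and the empirical data distribution.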
|
|
| 7 Nov |
Jeremias Knoblauch
|
Generalized variational inference
▶ Abstract
In this talk, I introduce a generalized representation of Bayesian inference. It is derived axiomatically, recovering existing Bayesian methods as special cases. It is then used to prove that variational inference (VI) based on the Kullback-Leibler Divergence with a variational family Q produces the optimal Q-constrained approximation to the exact Bayesian inference problem. Surprisingly, this implies that standard VI dominates any other Q-constrained approximation to the exact Bayesian inference problem. This means that alternative Q-constrained approximations such as VI minimizing other divergences and Expectation Propagation can produce better posteriors than VI only by implicitly targeting more appropriate Bayesian inference problems.
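For reference, the Q-constrained problem in question is, in its usual form,

\[
q^{*} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}} \; \mathrm{KL}\!\big( q \,\|\, \pi(\cdot \mid x) \big),
\]

i.e. standard VI returns the member of the variational family $\mathcal{Q}$ closest in Kullback-Leibler divergence to the exact posterior; the optimality claim above is made precise relative to this exact inference problem.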
|
Slides
|
| 9 Oct |
Magnus Rattray
|
Using Gaussian processes to infer pseudotime and branching from single-cell data.
▶ Abstract
I will describe some applications of Gaussian process models to single-cell data. We have developed a scalable implementation of the Gaussian process latent variable model (GPLVM) that can be used for pseudotime estimation when there is prior knowledge about pseudotime, e.g. from capture times available in single-cell time course data [1]. Other dimensions of the GPLVM latent space can then be used to model additional sources of variation, e.g. from branching of cells into different lineages.
|
|
| 20 Sep |
Sam Livingstone
|
On the robustness of gradient-based MCMC algorithms.
▶ Abstract
We analyse the tension between robustness and efficiency for Markov chain Monte Carlo (MCMC) sampling algorithms. In particular, we focus on robustness of MCMC algorithms with respect to heterogeneity in the target and their sensitivity to tuning, an issue of great practical relevance but still understudied theoretically. We show that the spectral gap of the Markov chains induced by classical gradient-based MCMC schemes (e.g. Langevin and Hamiltonian Monte Carlo) decays exponentially fast in the degree of mismatch between the scales of the proposal and target distributions, while for the random walk Metropolis (RWM) the decay is linear.
|
|
| 1 Aug |
Leah South
|
Variance reduction in MCMC.
|
|
| 13 Jun |
Clement Lee
|
Clustering approach and MCMC practicalities of stochastic block models.
▶ Abstract
The stochastic block model (SBM) is a popular choice for clustering nodes in a network. In this talk, a few versions of the SBM will be reviewed, with a focus on the clustering approach (hard vs soft) and its relation to the subsequent MCMC algorithm. Model selection and some practical issues will also be discussed.
|
|
| 9 May |
Nick Tawn
|
The Annealed Leap Point Sampler (ALPS) for multimodal target distributions.
▶ Abstract
Sampling from multimodal target distributions is a classical and challenging problem. Markov chain Monte Carlo methods typically rely on localised or gradient-based proposal mechanisms, so for target distributions exhibiting multimodality the chain can become trapped in a local mode, resulting in biased sample output. This talk introduces a novel algorithm, ALPS, designed to provide a scalable approach to sampling from multimodal target distributions. The ALPS algorithm combines a number of the strengths of the current gold-standard approaches for multimodality.
|
|
| 28 Mar |
Callum Vyner
|
An Introduction to Divide-and-Conquer MCMC.
|
|
| 28 Feb |
Matthew Ludkin
|
Hug ‘N’ Hop: Explicit, non-reversible, contour-hugging MCMC.
|
|
| 14 Feb |
Henry Moss
|
An Intro to Information-Driven Bayesian Optimisation
|
|
| 13 Dec |
Arnaud Doucet
|
On discrete-time piecewise-deterministic MCMC schemes
|
|
| 5 Dec |
Louis Aslett
|
Privacy and Security in Bayesian Inference
|
|
| 15 Nov |
Chris Sherlock
|
The Minimal Extended Statespace Algorithm for exact inference on Markov jump processes
|
|
| 7 Dec |
Gareth Ridall
|
Sequential Bayesian estimation and model selection
▶ Abstract
Work done in collaboration with Tony Pettitt from QUT Brisbane. I would like to: introduce the Dirichlet form, which can be thought of as a generalisation of the expected squared jumping distance, and show that the spectral gap has a variational representation over Dirichlet forms; and introduce the asymptotic variance of a Markov chain, which is the theoretical counterpart of the practical measure 1/(effective sample size), and provide a variational representation of this.
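In the reversible, discrete-state case these objects take the following standard form: for a chain with transition kernel $P$ and stationary distribution $\pi$,

\[
\mathcal{E}(f) \;=\; \tfrac{1}{2} \sum_{x,y} \pi(x)\,P(x,y)\,\big(f(x) - f(y)\big)^2, \qquad \mathrm{Gap}(P) \;=\; \inf_{\mathrm{Var}_\pi(f) \neq 0} \frac{\mathcal{E}(f)}{\mathrm{Var}_\pi(f)},
\]

and $\mathcal{E}(f)$ is half the expected squared jump of $f$ at stationarity, which is the sense in which the Dirichlet form generalises expected squared jumping distance.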
|
|
| 29 Nov |
Chris Nemeth
|
Pseudo-extended MCMC
▶ Abstract
MCMC algorithms are a class of exact methods for sampling from target distributions. If the target is multimodal, MCMC algorithms often struggle to explore all of its modes within a reasonable number of iterations. This issue can become even more pronounced when using efficient gradient-based samplers, such as HMC, which tend to become trapped in local modes. In this talk, I’ll outline how the pseudo-extended target, based on pseudo-marginal MCMC, can be used to improve the mixing of the HMC sampler by tempering the target distribution.
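A reference point for the pseudo-marginal construction underlying this: if $\hat{\pi}_u(x)$ is a non-negative estimator satisfying $\mathbb{E}_{u \sim q}[\hat{\pi}_u(x)] = \pi(x)$, then the extended density

\[
\bar{\pi}(x, u) \;\propto\; \hat{\pi}_u(x)\, q(u)
\]

has $\pi$ as its $x$-marginal, so a sampler on the extended space still targets the correct distribution; the pseudo-extended approach uses this extra freedom to temper the target seen by HMC and ease movement between modes.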
|
|
| 9 Nov |
Luke Kelly
|
Lateral trait transfer in phylogenetic inference
▶ Abstract
We are interested in inferring the phylogeny, or shared ancestry, of a set of species descended from a common ancestor. When traits pass vertically through ancestral relationships, the phylogeny is a tree and one can often compute the likelihood efficiently through recursions. Lateral transfer, whereby evolving species exchange traits outside of ancestral relationships, is a frequent source of model misspecification in phylogenetic inference. We propose a novel model of species diversification which explicitly controls for the effect of lateral transfer.
|
|
| 1 Nov |
Yee Whye Teh
|
On Bayesian Deep Learning and Deep Bayesian Learning
▶ Abstract
Probabilistic and Bayesian reasoning is one of the principal theoretical pillars of our understanding of machine learning. Over the last two decades, it has inspired a whole range of successful machine learning methods and influenced the thinking of many researchers in the community. On the other hand, in the last few years the rise of deep learning has completely transformed the field and led to a string of phenomenal, era-defining successes.
|
|
| 18 May |
Chris Sherlock
|
Asymptotic variance and geometric convergence of MCMC: variational representations
▶ Abstract
An MCMC algorithm is geometrically ergodic if it converges to the intended posterior geometrically in the number of iterations. A number of useful properties follow from geometric ergodicity, including that the practical efficiency measure of “effective sample size” is meaningful for any sensible function of interest. The standard method for proving geometric ergodicity for a particular algorithm involves a “drift condition” and a “small set”, and can be time consuming, both in the proof itself and in understanding why the drift condition and small set are helpful.
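For reference, the two ingredients take their usual form here: geometric ergodicity asks that

\[
\big\| P^{n}(x, \cdot) - \pi \big\|_{\mathrm{TV}} \;\le\; C(x)\,\rho^{\,n} \quad \text{for some } \rho < 1,
\]

and the standard route to proving it is a drift condition $PV \le \lambda V + b\,\mathbf{1}_{C}$ with $\lambda < 1$, together with a minorisation of the kernel on the small set $C$.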
|
|
| 26 Jan |
Chris Sherlock
|
Delayed-acceptance MCMC with examples: advantages and pitfalls and how to avoid the latter
▶ Abstract
When conducting MCMC using the Metropolis-Hastings algorithm the posterior distribution must be evaluated at the proposed point at every iteration; in many situations, however, the posterior is computationally expensive to evaluate. When a computationally cheap approximation to the posterior is also available, the delayed acceptance algorithm (aka surrogate transition method) can be used to increase the efficiency of the MCMC whilst still targeting the correct posterior. In the first part of this talk I will explain and justify the algorithm itself and overview a number of examples of its (successful) application.
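A sketch of the two-stage acceptance step, in the standard form of the algorithm: with cheap approximation $\hat{\pi}$, expensive posterior $\pi$ and proposal $q$, a move from $x$ to $y$ is first screened using only the surrogate,

\[
\alpha_1(x,y) \;=\; \min\!\left\{ 1, \; \frac{\hat{\pi}(y)\, q(y, x)}{\hat{\pi}(x)\, q(x, y)} \right\},
\]

and only if this stage accepts is the expensive posterior evaluated, with second-stage acceptance probability

\[
\alpha_2(x,y) \;=\; \min\!\left\{ 1, \; \frac{\pi(y)\, \hat{\pi}(x)}{\pi(x)\, \hat{\pi}(y)} \right\},
\]

so most rejections cost only a surrogate evaluation while detailed balance with respect to $\pi$ is retained.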
|
|
| 6 Dec |
Jack Baker
|
An overview of Bayesian non-parametrics
|
|
| 11 Nov |
Wentao Li
|
Improved Convergence of Regression Adjusted Approximate Bayesian Computation
|
|
| 20 Oct |
Paul Fearnhead
|
The Scalable Langevin Exact Algorithm: Bayesian Inference for Big Data
|
|
| 2 Jul |
Adam Johansen
|
The iterated auxiliary particle filter
|
|
| 19 May |
Chris Sherlock
|
Pseudo-marginal MCMC using averages of unbiased estimators
|
|
| 9 May |
Joris Bierkens
University of Warwick
|
Super-efficient sampling using Zig Zag Monte Carlo
|
|
| 14 Apr |
Paul Fearnhead
|
Research opportunities with MCMC and Big Data
|
|
| 17 Mar |
Peter Neal
|
Optimal scaling of the independence sampler
|
|
| 25 Feb |
Paul Fearnhead
|
Continuous-Time Importance Sampling (and MCMC)
|
|
| 18 Feb |
Borja de Balle Pigem
|
Differentially Private Policy Evaluation
|
|
| 10 Dec |
Jack Baker
|
STAN
|
|
| 26 Nov |
Paul Fearnhead
|
Discussion of “The Bouncy Particle Sampler: A Non-Reversible Rejection-Free Markov Chain Monte Carlo Method”
|
Link
Link
|
| 15 Oct |
James Hensman
|
Variational inference in Gaussian process models
|
|
| 19 May |
Alexandre Thiery
National University of Singapore
|
Asymptotic Analysis of Random-Walk Metropolis on Ridged Densities
|
|
| 28 Apr |
Chris Sherlock
|
Delayed acceptance particle marginal random walk Metropolis algorithms and their optimisation
|
|
| 5 Mar |
Chris Nemeth
|
Bayesian Inference for Big Data: Current and Future Directions
|
|
| 18 Dec |
Wentao Li
|
Discussion of the RSS read paper: “Sequential Quasi Monte Carlo” by Mathieu Gerber and Nicolas Chopin.
|
|
| 28 Nov |
Chris Nemeth
|
Particle Metropolis adjusted Langevin algorithms
|
|
| 11 Mar |
Paul Fearnhead
|
Reparameterisations for Particle MCMC
|
|
| 25 Feb |
Vasileios Maroulas
University of Tennessee
|
Filtering, drift homotopy and target tracking
|
|
| 11 Dec |
Dennis Prangle
University of Bristol
|
Speeding ABC inference using early-stopping simulations
|
|
| 9 May |
Chris Sherlock
|
Properties and Optimisation of the Pseudo Marginal RWM.
|
|
| 17 Apr |
Anthony Lee
University of Warwick
|
Particle Markov chain Monte Carlo and marginal likelihood estimation: strategies for improvement.
|
|
| 22 Mar |
Dennis Prangle
|
Likelihood-free parameter estimation for state space models
|
|
| 20 Feb |
Joe Mellor
University of Manchester
|
Thompson Sampling in Switching Environments with Bayesian Online Change Point Detection
|
|
| 21 Jun |
Nicos Pavlidis
Lancaster University
|
Classification in Dynamic Streaming Environments
|
|
| 6 Jun |
Paul Fearnhead
|
Hamiltonian Monte Carlo: Beyond Kinetic Energy
|
|
| 22 May |
Chris Sherlock
|
Metropolis Adjusted Langevin Algorithm (MALA), simplified Manifold MALA, and Hamiltonian Monte Carlo: motivation, explanation and application
|
|
| 14 Feb |
Dennis Prangle
|
Summary statistics for ABC model choice
|
|
| 13 Dec |
Paul Fearnhead
|
Constructing summary statistics for approximate Bayesian computation: semi-automatic ABC
|
|
| 16 Nov |
Haeran Cho
London School of Economics
|
High-dimensional variable selection via tilting
|
|
| 17 Jun |
Gareth Ridall
|
Online inference and model selection using sequential Monte Carlo
|
|
| 24 May |
Chris Sherlock
|
Simulation of mixed speed biochemical reactions using the linear noise approximation
|
|
| 15 Mar |
Paul Fearnhead
|
Reading group on “An explicit link between Gaussian fields and Gaussian Markov random fields: The SPDE approach”
|
RSS
|
| 15 Feb |
Neil Drummond
Lancaster University
|
Quantum Monte Carlo
|
|
| 18 Jan |
Rebecca Killick
|
Optimal detection of changepoints with a linear computational cost
|
|
| 7 Dec |
Dennis Prangle
|
Using ABC for sequential Bayesian analysis
|
|
| 10 Nov |
Krzysztof Latuszynski
University of Warwick
|
Exact Inference for a Markov switching diffusion model with discretely observed data
|
|
| 3 Nov |
Anastasia Lykou
|
Bayesian variable selection using Lasso
|
|
| 10 Oct |
Paul Fearnhead
|
Reading group on “Riemann manifold Langevin and Hamiltonian Monte Carlo methods”
|
|
| 3 Sep |
Paul Fearnhead
|
Particle Filters for models with fixed parameters
|
|
| 16 Feb |
Gareth Ridall
|
Reading group on Particle MCMC and the pseudo marginal algorithm
|
|
| 1 Dec |
Chris Sherlock
|
The random walk Metropolis: general criteria for the 0.234 acceptance rate rule
|
|
| 3 Nov |
Giorgos Sermaidis
|
Likelihood based inference for discretely observed diffusions
|
|
| 20 Oct |
Paul Fearnhead
|
Sequential Importance Sampling for General Diffusion Models
|
|
| 28 Apr |
Chris Sherlock
Reading Group
|
The Integrated Nested Laplace Approximation of Rue et al. (2009)
|
RSS
|
| 2 Dec |
Paul Fearnhead
|
Change point models and fault detection
|
|
| 28 Oct |
Chris Sherlock
|
Optimal scaling of the random walk Metropolis - Part 1
|
|
| 2 Jun |
Ben Taylor
|
Adaptive Sequential Monte Carlo Methods For Static Inference in Bayesian Mixture Analysis
|
|
| 13 May |
Joe Whittaker
|
The linear least squares prediction view of conjugate gradients
|
|
| 18 Mar |
Hongsheng Dai
|
Perfect sampling for Random Trees
|
|
| 4 Mar |
Dennis Prangle
|
An MCMC method for Approximate Bayesian Computation
|
|
| 5 Feb |
Paul Smith
|
Bayesian Analysis of ARMA and Transfer Function Time Series Models
|
|
| 28 Nov |
Chris Sherlock
|
Power sums of lognormals
|
|
| 21 Nov |
Thomas Jaki
|
Asymptotic simultaneous bootstrap confidence bounds for simple linear regression lines
|
|
| 31 Oct |
Paul Fearnhead
|
Using particle filters within MCMC
|
|