Non-asymptotic Analysis of MCMC Error in Bayesian Regression Models and Efficiency of MCMC Thinning
This talk is divided into two parts. In the first part, I will discuss joint work with Jorge Carlos Román on bounding the mean squared error (MSE) of Markov chain Monte Carlo (MCMC) estimates of posterior quantities in Bayesian regression models. To accomplish this, we establish drift and minorization conditions for the associated Markov chains, along with a V-norm condition for the functions of interest. This leads to an explicit upper bound on the MSE as a function of the MCMC sample size. In the second part of the talk, I will discuss the question, “When is it efficient to thin or subsample the output of a Markov chain?” I will present results by Art Owen (Stanford) on the efficiency of Markov chain thinning for generic autocorrelations, and examine the specific case where the autocorrelations decay as in an AR(1) process.
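To make the AR(1) special case concrete, here is a minimal sketch (not taken from the talk; the function name and parameter choices are illustrative) of how thinning affects effective sample size when the chain's lag-j autocorrelation is rho**j. Keeping every k-th draw yields a chain that is again AR(1), with lag-1 autocorrelation rho**k:

```python
# Hypothetical sketch: effective sample size (ESS) of an MCMC chain whose
# autocorrelations follow an AR(1) pattern (lag-j autocorrelation = rho**j),
# before and after thinning by a factor k. Thinning an AR(1) chain by k gives
# another AR(1) chain with parameter rho**k, so the standard ESS formula
# ESS = m * (1 - rho) / (1 + rho) applies to both.

def ess_ar1(n: int, rho: float, k: int = 1) -> float:
    """ESS of n draws thinned by k, assuming AR(1) autocorrelations."""
    rho_k = rho ** k   # lag-1 autocorrelation of the thinned chain
    m = n // k         # number of draws kept after thinning
    return m * (1 - rho_k) / (1 + rho_k)

full = ess_ar1(10_000, rho=0.9)          # ~526 effective draws
thinned = ess_ar1(10_000, rho=0.9, k=10) # ~483 effective draws
```

As the example suggests, thinning always discards some statistical information; it can still be efficient overall when the per-draw cost of storing or post-processing samples is high, which is the trade-off Owen's results quantify.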