Academic Seminar (Zhijian He, 2024.12.2)

Unbiased Markov chain quasi-Monte Carlo for Gibbs samplers

Posted by: Lu Yao    Date posted: 2024-11-21
Topic
Unbiased Markov chain quasi-Monte Carlo for Gibbs samplers
Time
-
Venue
Room 406, New Mathematics Building
Speaker
Prof. Zhijian He (South China University of Technology)
Host
Associate Prof. Cheng Cheng

Abstract: In statistical analysis, Monte Carlo (MC) is a classical numerical integration method. When sampling problems are challenging, Markov chain Monte Carlo (MCMC) is commonly employed. However, the MCMC estimator is biased after any fixed number of iterations. Unbiased MCMC, an advancement achieved through coupling techniques, removes this bias and allows many short chains to be run in parallel. Quasi-Monte Carlo (QMC), known for its higher order of convergence, is an alternative to MC. By incorporating the idea of QMC into MCMC, Markov chain quasi-Monte Carlo (MCQMC) effectively reduces the variance of MCMC, especially in Gibbs samplers. This work presents a novel approach that integrates unbiased MCMC with MCQMC, called the unbiased MCQMC method. This method yields unbiased estimators while significantly improving the rate of convergence. Numerical experiments demonstrate that, for Gibbs sampling, unbiased MCQMC with a sample size of N achieves a faster root mean square error (RMSE) rate than the O(N^{-1/2}) rate of unbiased MCMC, approaching an RMSE rate of O(N^{-1}) for low-dimensional problems. Surprisingly, even in a challenging 1049-dimensional Pólya-Gamma Gibbs sampler problem, the RMSE can still be reduced by a factor of several for moderate sample sizes. In the parallel setting, unbiased MCQMC also outperforms unbiased MCMC, even when running with short chains. This is joint work with Jiarui Du.
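
For illustration only, the following is a minimal Python sketch (assuming NumPy and SciPy are available) of the QMC-in-Gibbs idea described in the abstract: many short Gibbs chains for a toy bivariate normal target are driven through inverse-CDF updates, either by i.i.d. uniforms (plain MC) or by points of a scrambled Sobol' sequence (randomized QMC), and the RMSE of the resulting mean estimates is compared. The toy target and function names are hypothetical, and the coupling-based debiasing used in the unbiased MCQMC method of the talk is omitted for brevity; this is not the speakers' exact construction.

# Minimal sketch: QMC-driven vs. MC-driven short Gibbs chains (toy example).
import numpy as np
from scipy.stats import norm, qmc

rho, T = 0.8, 32          # target correlation, chain length (short chains)
dim = 2 * T               # uniforms consumed per chain (2 per Gibbs sweep)

def gibbs_chain(u, rho):
    # One Gibbs chain for a bivariate N(0, [[1,rho],[rho,1]]) target,
    # driven by a length-2T vector of uniforms via inverse-CDF updates.
    s = np.sqrt(1.0 - rho**2)
    x = y = 0.0
    for t in range(u.shape[0] // 2):
        x = rho * y + s * norm.ppf(u[2 * t])        # x | y update
        y = rho * x + s * norm.ppf(u[2 * t + 1])    # y | x update
    return x

def estimate(n_chains, use_qmc, rng):
    # Average the final x state over many short chains run "in parallel".
    if use_qmc:
        u = qmc.Sobol(d=dim, scramble=True, seed=rng).random(n_chains)
    else:
        u = rng.random((n_chains, dim))
    u = np.clip(u, 1e-12, 1 - 1e-12)                # guard against ppf(0) or ppf(1)
    return np.mean([gibbs_chain(ui, rho) for ui in u])

rng = np.random.default_rng(0)
reps, n_chains = 50, 256
for use_qmc, label in [(False, "MC  "), (True, "RQMC")]:
    est = np.array([estimate(n_chains, use_qmc, rng) for _ in range(reps)])
    rmse = np.sqrt(np.mean(est**2))                 # true mean of x is 0
    print(f"{label} RMSE of E[x] estimate: {rmse:.5f}")

In this simplified setting the RQMC-driven chains typically show a visibly smaller RMSE than the MC-driven ones at the same sample size, which is the kind of variance reduction for Gibbs samplers that the abstract refers to.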