Seminar | Institute of Mathematical Sciences
Time: Wednesday, October 23rd, 2024, 10:30-11:30
Location: IMS, RS408
Speaker: Xuefei Cao, Nankai University
Abstract: In Bayesian statistics, inferring the posterior distribution of unknown parameters typically requires evaluating the likelihood function of the observed data. In many complex statistical models, however, the likelihood is difficult to evaluate, making likelihood-based Bayesian inference hard to implement. Approximate Bayesian computation (ABC), a likelihood-free simulation-based inference method, has therefore gained considerable attention. While MCMC algorithms are commonly used for ABC inference, their inherently local exploration mechanism often leaves them trapped in local modes. We propose a novel Global-Local ABC-MCMC algorithm that combines the exploration capability of global proposals with the exploitation strength of local proposals. By integrating iterative importance resampling, we construct an effective global proposal distribution and prove that, under certain conditions, it converges faster than standard MCMC algorithms. The local sampler is optimized using Langevin dynamics and common random numbers. In addition, we introduce two adaptive schemes: selecting optimal hyperparameters based on the expected squared jumping distance, and iteratively improving the importance-sampling proposal with a normalizing-flow model. Numerical experiments show that our method improves sampling efficiency and achieves more reliable convergence for complex posteriors.
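To fix ideas before the talk, the following is a minimal, self-contained sketch (not the speaker's implementation) of a likelihood-free MCMC step that mixes a local random-walk proposal with a global independence proposal. The toy Gaussian model, tolerance, mixture weight, and the fixed global proposal (a stand-in for the iteratively refitted importance-sampling proposal) are all illustrative assumptions.

```python
# Sketch of ABC-MCMC with a global-local mixture proposal (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: observe the mean of n draws from N(theta, 1); prior theta ~ N(0, 5^2).
n_obs = 50
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, n_obs).mean()

def simulate_summary(theta):
    """Simulate data from the model and return its summary statistic (the mean)."""
    return rng.normal(theta, 1.0, n_obs).mean()

def log_prior(theta):
    return -0.5 * theta**2 / 5.0**2

def normal_logpdf(x, mean, sd):
    return -0.5 * ((x - mean) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))

# Assumed global independence proposal and local random-walk scale.
g_mean, g_sd = 0.0, 3.0   # global proposal N(0, 3^2)
local_sd = 0.3            # local random-walk scale
beta = 0.2                # probability of a global move
eps = 0.1                 # ABC tolerance

def log_q(to, frm):
    """Log-density of the global-local mixture proposal q(to | frm)."""
    return np.logaddexp(
        np.log(beta) + normal_logpdf(to, g_mean, g_sd),
        np.log(1 - beta) + normal_logpdf(to, frm, local_sd),
    )

theta = 0.0
chain = []
for _ in range(20000):
    # Draw from the mixture: global move with probability beta, otherwise local.
    if rng.uniform() < beta:
        prop = rng.normal(g_mean, g_sd)
    else:
        prop = rng.normal(theta, local_sd)
    # ABC acceptance: the simulated summary must fall within the tolerance,
    # and the Metropolis-Hastings ratio (prior times proposal) must pass.
    if abs(simulate_summary(prop) - y_obs) <= eps:
        log_ratio = (log_prior(prop) + log_q(theta, prop)
                     - log_prior(theta) - log_q(prop, theta))
        if np.log(rng.uniform()) < log_ratio:
            theta = prop
    chain.append(theta)

print("posterior mean ~", np.mean(chain[5000:]))
print("posterior std  ~", np.std(chain[5000:]))
```

The talk's method additionally adapts the local scale via the expected squared jumping distance, drives local moves with Langevin dynamics and common random numbers, and learns the global proposal with a normalizing flow; none of that is reflected in this toy sketch.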