Block Gibbs samplers for logistic mixed models: convergence properties and a comparison with full Gibbs samplers

11 Jan 2021  ·  Yalin Rao, Vivekananda Roy

The logistic linear mixed model (LLMM) is one of the most widely used statistical models. Generally, Markov chain Monte Carlo algorithms are used to explore the posterior densities associated with Bayesian LLMMs. Polson, Scott and Windle's (2013) Polya-Gamma data augmentation (DA) technique can be used to construct full Gibbs (FG) samplers for LLMMs. Here, we develop efficient block Gibbs (BG) samplers for Bayesian LLMMs using the Polya-Gamma DA method. We compare the FG and BG samplers in the context of a real data example, both as the correlation between the fixed and random effects changes and as the dimensions of the design matrices vary. These numerical examples demonstrate the superior performance of the BG samplers over the FG samplers. We also derive conditions guaranteeing geometric ergodicity of the BG Markov chain when the popular improper uniform prior is assigned to the regression coefficients, and proper or improper priors are placed on the variance parameters of the random effects. This theoretical result has important practical implications, as it justifies the use of asymptotically valid Monte Carlo standard errors for Markov chain based estimates of posterior quantities.
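To make the Polya-Gamma DA idea concrete, below is a minimal sketch of an FG-style sampler for a plain Bayesian logistic regression (no random effects, and a proper normal prior on the coefficients rather than the paper's improper uniform prior). The PG draws are approximated by truncating the infinite-sum-of-gammas representation from Polson, Scott and Windle (2013); the function and variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pg(b, c, K=100):
    # Approximate draw from Polya-Gamma PG(b, c) by truncating the
    # sum-of-gammas representation:
    #   PG(b, c) = (1 / (2 pi^2)) * sum_{k>=1} g_k / ((k - 1/2)^2 + c^2 / (4 pi^2)),
    # where g_k ~ Gamma(b, 1) independently. K is the truncation level.
    k = np.arange(1, K + 1)
    c = np.atleast_1d(np.asarray(c, dtype=float))
    g = rng.gamma(shape=b, scale=1.0, size=(c.size, K))
    denom = (k - 0.5) ** 2 + (c.reshape(-1, 1) / (2 * np.pi)) ** 2
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def pg_gibbs_logistic(X, y, B0_inv, n_iter=500):
    # Two-block Gibbs sampler for logistic regression with beta ~ N(0, B0):
    #   omega_i | beta ~ PG(1, x_i' beta)                     (data augmentation)
    #   beta | omega   ~ N(m, V),  V = (X' Omega X + B0_inv)^{-1},
    #                              m = V X' (y - 1/2)
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        omega = sample_pg(1.0, X @ beta)
        V = np.linalg.inv(X.T @ (omega[:, None] * X) + B0_inv)
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)
        draws[t] = beta
    return draws

# Synthetic check: the posterior means should land near the generating coefficients.
n, p = 400, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([-0.5, 1.0, -1.5])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))
draws = pg_gibbs_logistic(X, y, B0_inv=0.01 * np.eye(p), n_iter=500)
print(draws[200:].mean(axis=0))  # posterior means after burn-in
```

In the paper's LLMM setting, the FG sampler would update the fixed effects, random effects, PG latent variables, and variance components in separate conditional steps, whereas the BG samplers group parameters into larger blocks; this sketch only illustrates the PG augmentation mechanism itself.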



Methodology 60J05, 62-08

