Bayesian analysis for a class of beta mixed models

Over the last 10 years there has been growing interest in the class of generalized linear mixed models (GLMMs). One possible reason for such popularity is that GLMMs combine generalized linear models (GLMs) \citep{Nelder1972} with Gaussian random effects, adding flexibility to the models and accommodating complex data structures such as hierarchical, repeated and longitudinal measurements, among others that typically induce extra variability and/or dependence. GLMMs can also be seen as a natural extension of linear mixed models \citep{Pinheiro:2000}, allowing flexible distributions for the response variable. Common choices are the Gaussian for continuous data, the Poisson and negative binomial for count data, and the binomial for binary data. These three situations comprise the majority of applications within this class of models; some examples can be found in \citet{Breslow:1993} and \citet{Molenberghs:2005}.

Despite this flexibility, there are situations in which the response variable is continuous but bounded, such as rates, percentages, indices and proportions. In these situations the traditional GLMM based on the Gaussian distribution is not adequate, since the bounded support of the response is ignored. One approach used to model this type of data is based on the beta distribution. The beta distribution is very flexible, with a density function that can exhibit quite different shapes, including left- or right-skewed, symmetric, J-shaped and inverted J-shaped \citep{Da-Silva2011}. Regression models for independent and identically distributed beta variables were proposed by \citet{Paolino2001}, \citet{Kieschnick2003} and \citet{Ferrari2004}. The basic assumption is that the response follows a beta law whose expected value is related to a linear predictor through a link function.

...... at the center of the document ...... results that help to choose the prior distributions. The goal of this article is therefore to present Bayesian inference for beta mixed models using INLA. We discuss the choice of prior distributions and measures of model comparison. The results obtained with INLA are compared with those obtained using an MCMC algorithm and a likelihood analysis. The model is illustrated with the analysis of a real dataset, previously analyzed by \citet{Bonat2013}. Particular care is given to the choice of prior distributions for the precision parameter of the beta law.

The structure of this article is as follows. In Section 2 we define the Bayesian beta mixed model, and in Section 3 we describe the Integrated Nested Laplace Approximation (INLA). In Section 4 the model for the motivating example is introduced and the analysis results are presented. We close with final considerations in Section 5.
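To make the basic assumption mentioned above concrete, the following display gives a minimal sketch of a beta mixed model in the mean-precision parameterization of \citet{Ferrari2004}; the notation (observation $j$ within group $i$, design vectors $x_{ij}$ and $z_{ij}$) is illustrative here and is defined formally in Section 2:
\begin{align*}
y_{ij} \mid b_i &\sim \mathrm{Beta}\big(\mu_{ij}\phi,\,(1-\mu_{ij})\phi\big), \qquad
\mathrm{E}(y_{ij} \mid b_i) = \mu_{ij}, \qquad
\mathrm{Var}(y_{ij} \mid b_i) = \frac{\mu_{ij}(1-\mu_{ij})}{1+\phi},\\
g(\mu_{ij}) &= x_{ij}^\top \beta + z_{ij}^\top b_i, \qquad b_i \sim \mathrm{N}(0, \Sigma),
\end{align*}
where $g(\cdot)$ is a link function such as the logit and $\phi > 0$ is the precision parameter of the beta law.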
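For concreteness, a minimal R sketch of the kind of INLA fit discussed in Sections 3 and 4 is given below. It assumes the R-INLA package with its beta likelihood; the data frame \texttt{dat}, the variable names and the iid random-intercept structure are illustrative assumptions rather than the exact specification used in the paper.
\begin{verbatim}
## Minimal sketch: beta mixed model via R-INLA (illustrative names)
library(INLA)

## dat is assumed to contain: y in (0, 1), a covariate x, and a
## grouping factor 'subject' carrying the Gaussian random intercept
formula <- y ~ x + f(subject, model = "iid",
                     hyper = list(prec = list(prior = "loggamma",
                                              param = c(1, 0.01))))

fit <- inla(formula,
            family = "beta",      # beta likelihood, logit link for the mean
            data = dat,
            control.compute = list(dic = TRUE, waic = TRUE))

summary(fit)                # posterior summaries of the fixed effects
fit$summary.hyperpar        # beta precision and random-effect precision
\end{verbatim}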