This talk introduces a variational inference method popular in Bayesian inference for machine learning, together with a recently developed amortized variational inference method. In amortized variational inference, the local variational parameters that determine the posterior approximations for the latent variables are parametrized more parsimoniously as a function of global variational parameters and local data. The reduction in the number of variational parameters that amortization brings leads to very fast algorithms for fitting computationally expensive models. As an application, this talk introduces the detection of outlying clusters in generalized linear mixed models, using an approach in which repeated computations of posterior distributions for random effects are needed.
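The parameter reduction described above can be illustrated with a minimal sketch. This is not the method from the talk, just a toy example under assumed names (`W_mu`, `W_ls` for a hypothetical linear inference network): instead of storing a separate variational mean and log-standard-deviation for each of the n local latent variables, a single global map computes them from the local data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations, each with its own local latent variable.
n, d = 1000, 3
X = rng.normal(size=(n, d))

# Non-amortized Gaussian VI would store 2*n free local parameters
# (a mean and a log-std per latent variable). Amortized VI instead
# learns a global map from data to variational parameters; here the
# map is a simple linear "inference network" (hypothetical weights).
W_mu = rng.normal(scale=0.1, size=(d,))
W_ls = rng.normal(scale=0.1, size=(d,))

def amortized_params(X, W_mu, W_ls):
    """Per-observation variational means and log-stds computed
    from global parameters and local data."""
    mu = X @ W_mu          # shape (n,)
    log_sigma = X @ W_ls   # shape (n,)
    return mu, log_sigma

mu, log_sigma = amortized_params(X, W_mu, W_ls)

# Global parameter count is 2*d = 6, independent of n,
# versus 2*n = 2000 free local parameters without amortization.
print(W_mu.size + W_ls.size, 2 * n)
```

In a real amortized method the linear map would typically be a neural network, and its global weights would be optimized jointly with any model parameters by maximizing the evidence lower bound.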