Variational Inference for Bayesian Bridge Regression
Conference
64th ISI World Statistics Congress
Format: CPS Abstract
Keywords: bayesian, bridge, penalization, splines
Session: CPS 23 - Bayesian statistics
Monday 17 July 4 p.m. - 5:25 p.m. (Canada/Eastern)
Abstract
The bridge approach to regularization of coefficients in regression models uses the ℓ_α norm, with α ∈ (0, +∞), to define a penalization on large values of the regression coefficients; the lasso (α = 1) and ridge (α = 2) penalizations arise as particular cases. In Bayesian models, the penalization is enforced through a prior distribution on the coefficients. Although MCMC approaches are available for Bayesian bridge regression, they can be very slow for large datasets, especially in high dimensions. This paper develops an implementation of Automatic Differentiation Variational Inference (ADVI) for Bayesian inference in semi-parametric regression models with bridge penalization, where the non-parametric effects of covariates are modelled by B-splines. Because its updates are based on stochastic gradients, the proposed inference procedure allows the use of small batches of data at each iteration, drastically reducing computational time in comparison with MCMC. Full Bayesian inference is preserved, so joint uncertainty estimates for all model parameters are available. A simulation study illustrates the main properties of the proposed method, and an application to a large real dataset is presented.
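For reference, the bridge penalization described above corresponds, in the Bayesian formulation, to an exponential-power prior on each regression coefficient,

\[
p(\beta_j \mid \lambda, \alpha) \propto \exp\!\left(-\lambda \lvert \beta_j \rvert^{\alpha}\right), \qquad j = 1, \dots, p, \quad \lambda > 0,
\]

so that α = 1 recovers the Laplace prior of the Bayesian lasso and α = 2 the Gaussian (ridge) prior.

The following is a minimal sketch of the minibatch ADVI strategy mentioned in the abstract; it is illustrative rather than the paper's implementation, and assumes a Gaussian likelihood with known variance, fixed bridge hyperparameters (alpha, lam), a mean-field Gaussian variational family, and plain SGD updates. In the semi-parametric model the columns of X would contain B-spline basis evaluations of the covariates; here X is simulated.

# Illustrative sketch (not the authors' code): mean-field ADVI with minibatch
# subsampling for Gaussian regression under a bridge prior on the coefficients.
import jax
import jax.numpy as jnp

def log_joint_minibatch(beta, Xb, yb, scale, alpha=0.5, lam=5.0, sigma=1.0):
    # Minibatch log-likelihood rescaled to the full dataset (scale = N / batch size)
    loglik = -0.5 * jnp.sum((yb - Xb @ beta) ** 2) / sigma**2
    # Bridge prior: log p(beta) = -lam * sum_j |beta_j|^alpha (up to a constant)
    logprior = -lam * jnp.sum(jnp.abs(beta) ** alpha)
    return scale * loglik + logprior

def neg_elbo(params, key, Xb, yb, scale, n_mc=5):
    # Mean-field Gaussian variational family q(beta) = N(mu, diag(exp(log_sd))^2)
    mu, log_sd = params
    eps = jax.random.normal(key, (n_mc, mu.shape[0]))
    beta = mu + jnp.exp(log_sd) * eps           # reparameterization trick
    logq = jnp.sum(-0.5 * eps**2 - log_sd - 0.5 * jnp.log(2 * jnp.pi), axis=1)
    logp = jax.vmap(lambda b: log_joint_minibatch(b, Xb, yb, scale))(beta)
    return -(logp - logq).mean()                # Monte Carlo estimate of -ELBO

@jax.jit
def sgd_step(params, key, Xb, yb, scale, lr=1e-3):
    # One stochastic-gradient update of the variational parameters
    grads = jax.grad(neg_elbo)(params, key, Xb, yb, scale)
    return tuple(p - lr * g for p, g in zip(params, grads))

# Toy run: N = 2000 observations, p = 10 coefficients, minibatches of 100.
N, p, batch = 2000, 10, 100
X = jax.random.normal(jax.random.PRNGKey(1), (N, p))
beta_true = jnp.zeros(p).at[:3].set(jnp.array([2.0, -1.0, 0.5]))
y = X @ beta_true + 0.5 * jax.random.normal(jax.random.PRNGKey(2), (N,))

key = jax.random.PRNGKey(0)
params = (jnp.zeros(p), jnp.full(p, -2.0))      # (mu, log_sd)
for t in range(2000):
    key, k1, k2 = jax.random.split(key, 3)
    idx = jax.random.choice(k1, N, (batch,), replace=False)
    params = sgd_step(params, k2, X[idx], y[idx], N / batch)
print("variational posterior mean:", params[0])

Only a small batch of observations enters each gradient step, which is what makes the approach scale to large datasets, while the variational means and standard deviations retain joint (approximate) posterior uncertainty for all coefficients.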