Honey, I shrunk the irrelevant effects! Simple and flexible approximate Bayesian regularization

In the social and behavioral sciences and related fields, statistical models are becoming increasingly complex with more parameters to explain intricate dependency structures among larger sets of variables. Regularization techniques, like penalized regression, help identify key parameters by shrinking negligible effects to zero, resulting in parsimonious solutions with strong predictive performance. This paper introduces a simple and flexible approximate Bayesian regularization (ABR) procedure, combining a Gaussian approximation of the likelihood with a Bayesian shrinkage prior to obtain a regularized posterior. Parsimonious (interpretable) solutions are obtained by taking the posterior modes. Parameter uncertainty is quantified using the full posterior. Implemented in the R package shrinkem, the method is evaluated in synthetic and empirical applications. Its flexibility is demonstrated across various models, including linear regression, relational event models, mediation analysis, factor analysis, and Gaussian graphical models.
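The abstract's two-step recipe (a Gaussian approximation of the likelihood combined with a shrinkage prior, then the posterior mode for a sparse solution) can be illustrated with a toy sketch. This is not the shrinkem implementation or the paper's exact algorithm: the diagonal covariance approximation and the Laplace prior below are simplifying assumptions chosen so that the posterior mode has a closed soft-thresholding form.

```python
import numpy as np

# Hypothetical sketch of the ABR idea on linear regression (our own
# simplification, not the paper's method). Step 1: approximate the
# likelihood by a Gaussian centered at the MLE with its estimated
# variance. Step 2: combine it with a Laplace shrinkage prior.
# Step 3: take the posterior mode, which soft-thresholds negligible
# effects to exactly zero while leaving strong effects nonzero.

rng = np.random.default_rng(1)
n = 200
true_beta = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
X = rng.standard_normal((n, true_beta.size))
y = X @ true_beta + rng.standard_normal(n)

# Gaussian approximation of the likelihood: OLS estimate and variance
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - true_beta.size)
var_hat = sigma2 * np.diag(XtX_inv)  # diagonal approximation

# Laplace(0, 1/lam) prior: the mode of the resulting posterior is a
# coordinate-wise soft-threshold of beta_hat at lam * var_hat
lam = 100.0
mode = np.sign(beta_hat) * np.maximum(np.abs(beta_hat) - lam * var_hat, 0.0)

print(np.round(mode, 3))  # irrelevant effects land exactly at zero
```

Under a Gaussian prior instead of the Laplace prior, the mode would shrink coefficients toward zero without setting any of them exactly to zero, which is why heavier-tailed shrinkage priors are the natural choice for parsimonious solutions.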
