Abstract
We change the approach for computing posterior distributions in Bayesian Generalized Nonlinear Models (BGNLMs): we replace MCMC with variational Bayes, approximating the posterior with either a mean-field distribution or normalizing flows. Step by step, we go through the theory behind BGNLMs, variational inference, and normalizing flows. We also present the calculations needed to understand the new implementation, and provide a Python framework for training and testing BGNLMs. Through a series of applications we demonstrate that we obtain accurate predictions, together with easily computed uncertainty measures for those predictions.
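To illustrate the core idea behind the mean-field approach mentioned above, the sketch below fits a Gaussian variational distribution q(θ) = N(m, s²) to a known Gaussian target by stochastic gradient ascent on the ELBO with the reparameterization trick. The target density, step size, and iteration count are illustrative assumptions, not taken from the thesis or its implementation.

```python
import math
import random

# Hypothetical target "posterior" p(theta) = N(2, 0.5^2); in a real BGNLM
# setting this would be the (unnormalized) model posterior instead.
mu_p, sigma_p = 2.0, 0.5

def grad_log_p(theta):
    # Gradient of the Gaussian target log density.
    return (mu_p - theta) / sigma_p**2

random.seed(0)
m, log_s = 0.0, 0.0              # variational parameters of q = N(m, s^2)
lr = 0.02                        # illustrative step size
for step in range(5000):
    eps = random.gauss(0.0, 1.0)
    s = math.exp(log_s)
    theta = m + s * eps          # reparameterized sample theta ~ q
    g = grad_log_p(theta)
    m += lr * g                  # stochastic ELBO gradient w.r.t. m
    log_s += lr * (g * s * eps + 1.0)  # chain rule term + entropy grad (= 1)

print(m, math.exp(log_s))        # should approach the target's mean and sd
```

Because the target is itself Gaussian, the optimal mean-field solution recovers it exactly; for the non-Gaussian posteriors arising in BGNLMs, normalizing flows relax this restriction by transforming the base distribution through learned invertible maps.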