Markus Heinonen from Aalto University - Data Science
Date: - Location: Eurecom
Bayesian neural networks (BNNs) infer posterior distributions over neural networks that fit both the data and a prior. BNNs have demonstrated improved calibration, generalisation and robustness, but at the cost of expensive inference and challenges in model specification. In this talk I will present an alternative low-rank formulation of neural network posteriors, show how low-rank node-BNNs naturally mitigate these issues, and outline an avenue towards Bayes’ifying billion-parameter foundation models, such as GPT.