AABI 2021, 3rd Symposium on Advances in Approximate Bayesian Inference, January-February 2021 (Virtual Event)
The Bayesian treatment of neural networks requires placing a prior distribution over the weight and bias parameters of the network. Because the model is non-linear, any distribution over the parameters induces a distribution over the network output that is difficult to predict or control. Gaussian processes, by contrast, offer a rigorous framework for defining prior distributions directly over the space of functions. We propose to impose such functional priors on well-established neural network architectures by minimising the Wasserstein distance between samples of the corresponding stochastic processes. Early experimental results demonstrate the potential of functional priors for Bayesian neural networks.
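The following is a minimal sketch of this idea, not the authors' implementation: a one-hidden-layer network with Gaussian weight priors whose log-scales are tuned so that the induced function-space prior matches draws from an RBF Gaussian process, using a sliced Wasserstein estimate as a stand-in for the distance between the two processes. The network size, kernel hyperparameters, and names such as log_sw and log_sb are illustrative assumptions.

# Sketch only: match a BNN weight-space prior to a GP functional prior by
# minimising a sample-based (sliced) Wasserstein distance. Assumed setup,
# not the paper's implementation.
import torch

torch.manual_seed(0)
torch.set_default_dtype(torch.float64)

# Measurement points at which function draws from the two priors are compared.
x = torch.linspace(-3.0, 3.0, 64).unsqueeze(1)                # (n, 1)

# Target GP prior: zero mean, RBF kernel (illustrative hyperparameters).
lengthscale, amplitude = 1.0, 1.0
d2 = (x - x.T) ** 2
K = amplitude ** 2 * torch.exp(-0.5 * d2 / lengthscale ** 2)
L = torch.linalg.cholesky(K + 1e-8 * torch.eye(len(x)))

def sample_gp(n_samples):
    # Draws f ~ N(0, K) via the Cholesky factor.
    return (L @ torch.randn(len(x), n_samples)).T              # (n_samples, n)

# BNN prior: one hidden layer, Gaussian priors with learnable log-scales.
hidden = 50
log_sw = torch.zeros(2, requires_grad=True)   # weight prior log-scales per layer
log_sb = torch.zeros(2, requires_grad=True)   # bias prior log-scales per layer

def sample_bnn(n_samples):
    # Reparameterised draws from the weight-space prior, pushed through the net.
    sw, sb = log_sw.exp(), log_sb.exp()
    w1 = sw[0] * torch.randn(n_samples, 1, hidden)
    b1 = sb[0] * torch.randn(n_samples, 1, hidden)
    w2 = sw[1] * torch.randn(n_samples, hidden, 1) / hidden ** 0.5
    b2 = sb[1] * torch.randn(n_samples, 1, 1)
    h = torch.tanh(x.unsqueeze(0) @ w1 + b1)                    # (n_samples, n, hidden)
    return (h @ w2 + b2).squeeze(-1)                            # (n_samples, n)

def sliced_wasserstein(a, b, n_proj=100):
    # Monte Carlo estimate of the squared sliced 2-Wasserstein distance
    # between two equally sized clouds of function samples.
    proj = torch.randn(a.shape[1], n_proj)
    proj = proj / proj.norm(dim=0, keepdim=True)
    a_s, _ = torch.sort(a @ proj, dim=0)
    b_s, _ = torch.sort(b @ proj, dim=0)
    return ((a_s - b_s) ** 2).mean()

opt = torch.optim.Adam([log_sw, log_sb], lr=0.05)
for step in range(500):
    opt.zero_grad()
    loss = sliced_wasserstein(sample_bnn(128), sample_gp(128))
    loss.backward()
    opt.step()
    if step % 100 == 0:
        print(step, loss.item())

After optimisation, the learned log-scales define a weight-space prior whose induced function draws resemble the GP prior at the measurement points; in practice one would then run standard posterior inference for the BNN under that prior.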
Type:
Conference
Date:
2021-01-13
Department:
Data Science
Eurecom Ref:
6445
Copyright:
© EURECOM. Personal use of this material is permitted. The definitive version of this paper was published in AABI 2021, 3rd Symposium on Advances in Approximate Bayesian Inference, January-February 2021 (Virtual Event) and is available at:
See also: