One-line-of-code data mollification improves optimization of likelihood-based generative models
NeurIPS 2023, 37th Conference on Neural Information Processing Systems, 11-16 December 2023, New Orleans, USA

Generative Models (GMs) have attracted considerable attention due to their tremendous success in various domains, such as computer vision, where they are capable of generating impressively realistic-looking images. Likelihood-based GMs are attractive because they can generate new data with a single model evaluation. However, they typically achieve lower sample quality than state-of-the-art score-based Diffusion Models (DMs). This paper takes a significant step toward addressing this limitation. The idea is to borrow one of the strengths of score-based DMs, namely the ability to perform accurate density estimation in low-density regions and to address manifold overfitting, by means of data mollification. We connect data mollification through the addition of Gaussian noise to Gaussian homotopy, a well-known technique for improving optimization. Data mollification can be implemented by adding a single line of code to the optimization loop, and we demonstrate that this boosts the generation quality of likelihood-based GMs without computational overhead. We report results on image data sets with popular likelihood-based GMs, including variants of variational autoencoders and normalizing flows, showing large improvements in FID score.
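As a rough illustration of what the "one line of code" might look like, the sketch below adds annealed Gaussian noise to each training batch inside an otherwise unchanged PyTorch loop, so early iterations see a heavily smoothed data distribution (Gaussian homotopy) and later iterations see nearly clean data. The mollify helper, the sigma_max value, and the linear annealing schedule are illustrative assumptions, not the paper's exact recipe.

```python
import torch

def mollify(x: torch.Tensor, step: int, total_steps: int,
            sigma_max: float = 1.0) -> torch.Tensor:
    """Data mollification: perturb a batch with annealed Gaussian noise.

    The noise scale decays linearly from sigma_max to 0 over training
    (a hypothetical schedule chosen for simplicity), smoothing the data
    distribution early on and recovering the clean data at the end.
    """
    sigma = sigma_max * max(0.0, 1.0 - step / total_steps)
    return x + sigma * torch.randn_like(x)

# Usage inside a standard likelihood-based training loop
# (model.log_prob is a stand-in for the model's likelihood):
#
# for step, (x, _) in enumerate(loader):
#     x = mollify(x, step, total_steps)      # the added mollification line
#     loss = -model.log_prob(x).mean()
#     loss.backward()
#     optimizer.step()
#     optimizer.zero_grad()
```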
Type: Conference
City: New Orleans
Date: 2023-12-11
Department: Data Science
Eurecom Ref: 7320
Copyright: © NeurIPS. Personal use of this material is permitted. The definitive version of this paper was published in NeurIPS 2023, 37th Conference on Neural Information Processing Systems, 11-16 December 2023, New Orleans, USA and is available at:
See also: https://www.eurecom.fr/publication/7320