Latent abstractions, mutual information and generative diffusion models

Franzese, Giulio
Invited talk at KAUST (King Abdullah University of Science and Technology), 23 March 2025, Thuwal, Saudi Arabia


Diffusion-based generative models have achieved remarkable success in synthesizing high-dimensional data, yet a key question remains: how do they encode and leverage latent abstractions to guide the generative process? This talk introduces a mathematical framework that connects stochastic differential equations (SDEs) and information theory to address this question. The core idea is that diffusion models implicitly perform a form of stochastic filtering, in which an evolving latent state steers the dynamics of an observable process. The discussion highlights how this formalism sheds light on several fundamental mechanisms underlying generative models and offers a promising avenue for future research.
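
As an illustrative sketch only (the notation f, g, a, b, h, and the latent process Z_t are assumptions for exposition, not taken from the talk), the standard score-based diffusion SDEs and a generic latent/observable filtering pair of the kind alluded to above can be written as:

% Illustrative sketch: forward and reverse diffusion SDEs, plus a generic
% stochastic-filtering pair in which a latent state Z_t steers the observable X_t.
\begin{align}
  % Forward (noising) SDE for the observable process X_t
  \mathrm{d}X_t &= f(X_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t, \\
  % Reverse-time (generative) SDE, driven by the score of the marginal p_t
  \mathrm{d}X_t &= \bigl[f(X_t, t) - g(t)^2 \nabla_x \log p_t(X_t)\bigr]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{W}_t, \\
  % Filtering view: a latent abstraction Z_t drives the drift of X_t, and the
  % generative dynamics implicitly track the conditional law of Z_t given the
  % observed trajectory of X up to time t
  \mathrm{d}Z_t &= a(Z_t, t)\,\mathrm{d}t + b(t)\,\mathrm{d}B_t, \qquad
  \mathrm{d}X_t = h(X_t, Z_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t.
\end{align}

Under this reading, the information-theoretic part of the framework concerns how much of Z_t is recoverable from the observable trajectory, i.e. the mutual information between the latent abstraction and the generated process.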



Type:
Talk
City:
Thuwal
Date:
2025-03-23
Department:
Data Science
Eurecom Ref:
8162
Copyright:
© EURECOM. Personal use of this material is permitted. The definitive version of this talk was given as an invited talk at KAUST (King Abdullah University of Science and Technology), 23 March 2025, Thuwal, Saudi Arabia, and is available at:
See also:

PERMALINK : https://www.eurecom.fr/publication/8162