Diffusion-based generative models have achieved remarkable success in synthesizing high-dimensional data, yet a key question remains: how do they encode and leverage latent abstractions to guide the generative process? This talk introduces a mathematical framework that connects stochastic differential equations (SDEs) and information theory to address this question. The core idea is that diffusion models implicitly perform a form of stochastic filtering, in which an evolving latent state steers the dynamics of an observable process. Our discussion will highlight how this formalism sheds light on several fundamental mechanisms underlying generative models and offers a promising avenue for future research.
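
To make the filtering analogy concrete, the sketch below writes out the classical stochastic-filtering setup and one standard way it connects to the score-based drift of a diffusion model. The notation (Z_t, X_t, f, h) and the displayed identities are illustrative assumptions in textbook filtering conventions, not necessarily the talk's exact formulation.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Illustrative stochastic-filtering setup (assumed notation, not the talk's exact model):
% a hidden signal Z_t steers an observable process X_t.
\begin{align*}
  dZ_t &= f(Z_t)\,dt + dB_t,       && \text{(latent signal)}\\
  dX_t &= h(Z_t, X_t)\,dt + dW_t.  && \text{(observations steered by the latent state)}
\end{align*}
% The filtering problem: track the posterior of the latent state given the observation path,
\[
  \pi_t(\varphi) \;=\; \mathbb{E}\!\left[\varphi(Z_t) \,\middle|\, \mathcal{F}_t^{X}\right].
\]
% Link to diffusion models: writing the marginal as the mixture
% p_t(x) = \int p_t(x \mid z)\, p(z)\, dz and differentiating, the score that drives
% the generative SDE is itself a posterior expectation over the latent variable,
\[
  \nabla_x \log p_t(x) \;=\; \mathbb{E}\!\left[\nabla_x \log p_t(x \mid Z) \,\middle|\, X_t = x\right],
\]
% so the learned drift acts as an implicit filter estimate of the latent abstraction Z.
\end{document}
```

Read this way, the score plays the role of the filter's conditional estimate: generation proceeds by averaging latent-conditional dynamics under a posterior over the latent abstraction, which is the sense in which the filtering is "implicit".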