Localized Diffusion Models

English

Séminaire Données et Aléatoire Théorie & Applications

11/12/2025, 14:00 · Shuigen Liu · Salle 106

Diffusion models are state-of-the-art tools for a variety of generative tasks, but training them requires estimating high-dimensional score functions, a task that in principle suffers from the curse of dimensionality. Many real-world distributions, however, exhibit locality structure: sparse conditional dependencies that make the score function effectively low-dimensional.

In this talk, I introduce localized diffusion models, which exploit this locality structure by learning the score within a localized hypothesis space using a localized score matching loss. We prove that such localization enables diffusion models to circumvent the curse of dimensionality, at the price of an additional localization error. Under realistic sample-size scaling, we show both theoretically and numerically that a moderate localization radius balances the statistical and localization errors, yielding better overall performance. The localized structure also facilitates parallel training, making localized diffusion models potentially more efficient for large-scale applications.
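To make the idea concrete, here is a minimal numpy sketch of a localized score matching loss on a toy Gauss-Markov chain. Everything below is an illustrative assumption, not the talk's actual construction: the linear per-coordinate hypothesis class, the window radius `r`, and the denoising score matching formulation are hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, r = 8, 2000, 1     # dimension, sample size, localization radius (assumed)
sigma = 0.5              # noise level for denoising score matching

# Toy data: a stationary Gauss-Markov chain, whose score has sparse
# (tridiagonal) conditional dependencies -- a simple locality structure.
X = np.zeros((n, d))
X[:, 0] = rng.standard_normal(n)
for j in range(1, d):
    X[:, j] = 0.8 * X[:, j - 1] + 0.6 * rng.standard_normal(n)

# Denoising score matching: for a noisy sample x_noisy = x + sigma * eps,
# the regression target for the score at x_noisy is -(x_noisy - x) / sigma^2.
eps = rng.standard_normal((n, d))
X_noisy = X + sigma * eps
target = -(X_noisy - X) / sigma**2

# Localized hypothesis space (hypothetical): the i-th score component is a
# linear function of the window x[i-r : i+r+1] only. Each coordinate is fit
# independently, which is what makes parallel training straightforward.
loss = 0.0
for i in range(d):
    lo, hi = max(0, i - r), min(d, i + r + 1)
    A = np.hstack([X_noisy[:, lo:hi], np.ones((n, 1))])  # local features + bias
    coef, *_ = np.linalg.lstsq(A, target[:, i], rcond=None)
    resid = A @ coef - target[:, i]
    loss += np.mean(resid**2)  # localized score matching loss, coordinate i

print(f"localized score matching loss (r={r}): {loss:.3f}")
```

Increasing `r` enlarges each coordinate's window, shrinking the localization error but raising the statistical cost of the fit, which is the trade-off the abstract refers to.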