EcoSta 2024: Submission A0913
Title: Score identity distillation: Exponentially fast distillation of pretrained diffusion models for one-step generation
Authors: Mingyuan Zhou, University of Texas at Austin (United States) [presenting]
Abstract: Score identity distillation (SiD), an innovative data-free method, is introduced to distill the generative capabilities of pretrained diffusion models into a single-step generator. SiD achieves an exponentially fast reduction in Fréchet inception distance (FID) during distillation and approaches, or even exceeds, the FID performance of the original teacher diffusion models. By reformulating forward diffusion processes as semi-implicit distributions, three score-related identities are leveraged to construct an innovative loss mechanism. This mechanism achieves rapid FID reduction by training the generator on its own synthesized images, eliminating the need for real data or reverse-diffusion-based generation, all within a significantly shortened generation time. Evaluated across four benchmark datasets, the SiD algorithm demonstrates high iteration efficiency during distillation and surpasses competing distillation approaches in generation quality, whether they are one-step or few-step and whether they are data-free or rely on training data. This achievement redefines the benchmarks for efficiency and effectiveness not only in diffusion distillation but also in the broader field of diffusion-based generation.
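The data-free training loop described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact objective: `teacher_score` and `fake_score` are hypothetical stand-ins for the frozen pretrained score network and the auxiliary score network, and the squared score gap is a simplified surrogate for the loss SiD derives from its three score identities. What it does show is the structural point of the abstract: each step diffuses the generator's own synthesized samples forward and compares scores at the noisy point, with no real data and no reverse-diffusion sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher_score(x_t, t):
    # Hypothetical stand-in for the frozen pretrained diffusion model's
    # score network evaluated at noise level t.
    return -x_t / (1.0 + t**2)

def fake_score(x_t, t, psi):
    # Hypothetical stand-in for the auxiliary score network fit to the
    # generator's output distribution (parameter psi is illustrative).
    return psi * x_t

def sid_style_step(generator_out, psi, t=0.5):
    """One data-free distillation step: forward-diffuse the generator's OWN
    samples and penalize the teacher-vs-fake score gap at the noisy point.
    A simplified surrogate, not the SiD loss itself."""
    noise = rng.standard_normal(generator_out.shape)
    x_t = generator_out + t * noise            # forward diffusion of x_g
    gap = teacher_score(x_t, t) - fake_score(x_t, t, psi)
    return float(np.mean(gap**2))              # scalar surrogate loss

# A batch of one-step generator outputs (here: random placeholders).
x_g = rng.standard_normal((64, 2))
loss = sid_style_step(x_g, psi=-0.8)
```

In an actual run, `loss` would be differentiated with respect to the generator's parameters (and `psi` updated in an alternating fashion); the sketch omits autodiff to stay self-contained.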