# Scale-wise Distillation SDXL
Scale-wise Distillation (SwD) is a novel framework for accelerating diffusion models (DMs)
by progressively increasing spatial resolution during the generation process.
SwD achieves significant speedups (2.5× to 10×) compared to full-resolution models
while maintaining or even improving image quality.
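To make the scale-wise schedule concrete: each generation step denoises on a progressively larger latent grid, and since SDXL's VAE downsamples images by a factor of 8, the latent scales used in the usage example (64 → 80 → 96 → 128) correspond to pixel resolutions 512 → 640 → 768 → 1024. A minimal sketch (the helper name is illustrative, not part of the SwD API):

```python
# Map SwD latent scales to pixel resolutions, assuming SDXL's VAE
# downsampling factor of 8 (latent 128 -> image 1024).
VAE_FACTOR = 8

def scale_schedule(scales):
    """Return (latent_size, pixel_size) pairs, one per generation step."""
    return [(s, s * VAE_FACTOR) for s in scales]

# The schedule from the usage example below: four steps, growing from
# 512x512 up to the full 1024x1024 output resolution.
print(scale_schedule([64, 80, 96, 128]))
# -> [(64, 512), (80, 640), (96, 768), (128, 1024)]
```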

- Project page: https://yandex-research.github.io/swd
- GitHub: https://github.com/yandex-research/swd
- Demo: https://huggingface.co/spaces/dbaranchuk/Scale-wise-Distillation
## Usage

Upgrade to the latest versions of 🧨 diffusers and peft:

```shell
pip install -U diffusers
pip install -U peft
```

and then you can run:
```python
import torch
from diffusers import DDPMScheduler, StableDiffusionXLPipeline
from peft import PeftModel

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    custom_pipeline="quickjkee/swd_pipeline_sdxl",
).to("cuda")
pipe.scheduler = DDPMScheduler.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    subfolder="scheduler",
)

lora_path = "yresearch/swd-sdxl"
pipe.unet = PeftModel.from_pretrained(
    pipe.unet,
    lora_path,
)

prompt = "Cute winter dragon baby, kawaii, Pixar, ultra detailed, glacial background, extremely realistic."
sigmas = [1.0000, 0.8000, 0.6000, 0.4000, 0.0000]
scales = [64, 80, 96, 128]

image = pipe(
    prompt,
    timesteps=torch.tensor(sigmas) * 1000,
    scales=scales,
    height=1024,
    width=1024,
    guidance_scale=1.0,
).images[0]
```
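Note how the schedule lists relate: `sigmas` has one more entry than `scales`, since the trailing `0.0` presumably marks the fully denoised sample rather than another generation step, and the timesteps passed to the pipeline are the sigmas mapped onto the scheduler's 1000-step training grid. A hedged sanity-check helper (illustrative only, not part of the pipeline API):

```python
# Validate a SwD schedule before calling the pipeline: one latent scale per
# denoising step, ending at sigma 0.0 (the clean sample), and timesteps
# obtained by scaling sigmas onto the scheduler's 1000-step discretization.
def check_schedule(sigmas, scales, num_train_timesteps=1000):
    assert len(sigmas) == len(scales) + 1, "expected one scale per step"
    assert sigmas[-1] == 0.0, "schedule should end at a clean sample"
    return [s * num_train_timesteps for s in sigmas]

print(check_schedule([1.0, 0.8, 0.6, 0.4, 0.0], [64, 80, 96, 128]))
# -> [1000.0, 800.0, 600.0, 400.0, 0.0]
```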
## Citation

```bibtex
@inproceedings{
  starodubcev2026scalewise,
  title={Scale-wise Distillation of Diffusion Models},
  author={Nikita Starodubcev and Ilya Drobyshevskiy and Denis Kuznedelev and Artem Babenko and Dmitry Baranchuk},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=Z06LNjqU1g}
}
```