This repository is the official implementation of our paper “BiDM: Pushing the Limit of Quantization for Diffusion Models”. [PDF]
Set up a virtual environment and install the dependencies following the instructions in latent-diffusion.
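For reference, the latent-diffusion repository ships a conda `environment.yaml`; a typical setup might look like the following (the environment name `ldm` comes from that file — adjust if your checkout differs):

```shell
# Clone the upstream LDM codebase and create its conda environment
git clone https://github.com/CompVis/latent-diffusion.git
cd latent-diffusion
conda env create -f environment.yaml
conda activate ldm
```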
- Replace the existing `main.py` in the LDM codebase with our version of `main.py`.
- Place `openaimodel_ours.py` and `util_ours.py` in the directory `./ldm/modules/diffusionmodules`.
- Place `ddpm_ours.py` and `ddim_ours.py` in the directory `./ldm/models/diffusion`.
- Run `bash train.sh`.
- Results for LDM on unconditional generation with DDIM (100 sampling steps).
- Samples generated by the binarized DM baseline and by BiDM under W1A1 bit-width.
- Our codebase builds on latent-diffusion and stable-diffusion. Thanks to their authors for open-sourcing!
If you find BiDM useful and helpful for your work, please kindly cite our paper:
```
@inproceedings{zhengbidm,
  title={BiDM: Pushing the Limit of Quantization for Diffusion Models},
  author={Zheng, Xingyu and Liu, Xianglong and Bian, Yichen and Ma, Xudong and Zhang, Yulun and Wang, Jiakai and Guo, Jinyang and Qin, Haotong},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems}
}
```