A repository of generative modeling algorithms for unconditional MNIST generation.

ethanherron/genmod


Generative Modeling Algorithms for Unconditional MNIST Generation

This repository serves as an educational resource, providing implementations of various generative modeling algorithms for unconditional image generation on the MNIST dataset. Each algorithm is implemented in a separate trainer class within the trainers module, allowing for easy comparison and study of different approaches.

Repository Outline

  • networks/: Contains model architectures used by the trainers.
  • trainers/: Implements various generative modeling algorithms for MNIST generation.
  • main.py: Script to run training and sampling for selected models.
  • README.md: This file, providing an overview and citations.

Implemented Algorithms

1. Denoising Diffusion Probabilistic Models (DDPM)

  • Trainer: trainers/ddpm.py
  • Citation: Ho, J., Jain, A., & Abbeel, P. (2020). Denoising Diffusion Probabilistic Models. arXiv preprint arXiv:2006.11239.
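As a rough illustration (this is a sketch of the simple training objective from Ho et al. (2020), not the actual code in trainers/ddpm.py), DDPM training reduces to noising an image with a randomly sampled timestep and regressing the network onto the added noise:

```python
import torch

def ddpm_loss(model, x0, alphas_cumprod):
    # Sample one timestep per image and standard Gaussian noise.
    t = torch.randint(0, len(alphas_cumprod), (x0.shape[0],), device=x0.device)
    noise = torch.randn_like(x0)
    # Forward process: x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * noise.
    abar = alphas_cumprod[t].view(-1, 1, 1, 1)
    xt = abar.sqrt() * x0 + (1 - abar).sqrt() * noise
    # "Simple" loss: the network predicts the noise that was added.
    return torch.mean((model(xt, t) - noise) ** 2)
```

The signature (a model taking `(x_t, t)`) is an assumption for illustration; the repository's trainer may organize the schedule and model inputs differently.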

2. Elucidating the Design Space of Diffusion-Based Generative Models (EDM)

  • Trainer: trainers/edm.py
  • Citation: Karras, T., Aittala, M., Aila, T., & Laine, S. (2022). Elucidating the Design Space of Diffusion-Based Generative Models. arXiv preprint arXiv:2206.00364.

3. Rectified Flow (RF)

  • Trainer: trainers/rf.py
  • Citation: Liu, X., Gong, C., & Liu, Q. (2023). Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow. arXiv preprint arXiv:2209.03003.
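For intuition (an illustrative sketch, not the code in trainers/rf.py), rectified flow trains a velocity field on straight-line paths between noise and data, regressing onto the constant velocity x1 - x0:

```python
import torch

def rectified_flow_loss(model, x1):
    # Pair each data sample x1 with Gaussian noise x0 and a uniform time t.
    x0 = torch.randn_like(x1)
    t = torch.rand(x1.shape[0], device=x1.device).view(-1, 1, 1, 1)
    # Point on the straight path between noise and data.
    xt = (1 - t) * x0 + t * x1
    # The velocity of this path is constant: x1 - x0.
    return torch.mean((model(xt, t.flatten()) - (x1 - x0)) ** 2)
```

Sampling then integrates the learned ODE dx/dt = model(x, t) from t = 0 to t = 1; the model signature here is assumed for illustration.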

4. Variational Rectified Flow (VRF)

  • Trainer: trainers/vrf.py
  • Citation: Guo, P., & Schwing, A. G. (2025). Variational Rectified Flow Matching. arXiv preprint arXiv:2502.09616.

5. PFGM++ (PFGMpp)

  • Trainer: trainers/pfgmpp.py
  • Citation: Xu, Y., Liu, Z., Tian, Y., Tong, S., Tegmark, M., & Jaakkola, T. (2023). PFGM++: Unlocking the Potential of Physics-Inspired Generative Models. arXiv preprint arXiv:2302.04265.

6. Consistency Models (CM)

  • Trainer: trainers/cm.py
  • Citation: Song, Y., Dhariwal, P., Chen, M., & Sutskever, I. (2023). Consistency Models. arXiv preprint arXiv:2303.01469.
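In simplified form (an illustrative sketch only; the paper, and likely trainers/cm.py, add EDM-style preconditioning, boundary conditions, and schedule details), consistency training enforces that the model maps adjacent noise levels along the same trajectory to the same output, using an EMA copy as the target:

```python
import torch

def consistency_training_loss(model, ema_model, x, sigmas):
    # Pick adjacent noise levels sigma_i < sigma_{i+1} from a fixed schedule.
    i = torch.randint(0, len(sigmas) - 1, (x.shape[0],), device=x.device)
    z = torch.randn_like(x)
    s_lo = sigmas[i].view(-1, 1, 1, 1)
    s_hi = sigmas[i + 1].view(-1, 1, 1, 1)
    # The online model at the higher noise level must match the EMA
    # ("target") model at the lower level: self-consistency along the ODE.
    pred = model(x + s_hi * z, sigmas[i + 1])
    with torch.no_grad():
        target = ema_model(x + s_lo * z, sigmas[i])
    return torch.mean((pred - target) ** 2)
```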

7. Cold Diffusion (CD)

  • Trainer: trainers/cd.py
  • Citation: Bansal, A., Borgnia, E., Chu, H. M., Li, J. S., Kazemi, H., Huang, F., Goldblum, M., Geiping, J., & Goldstein, T. (2022). Cold Diffusion: Inverting Arbitrary Image Transforms Without Noise. arXiv preprint arXiv:2208.09392.

8. Variational Diffusion Models (VDM)

  • Trainer: trainers/vdm.py
  • Citation: Kingma, D. P., Salimans, T., Poole, B., & Ho, J. (2021). Variational Diffusion Models. arXiv preprint arXiv:2107.00630.

Usage

To run the training and sampling for a specific model:

python main.py --model <model_name> [additional arguments]

Replace <model_name> with one of the implemented algorithms (e.g., ddpm, edm, rf). Use --help to see all available options.

Code References

Notes

This repository was built for personal use and for teaching a section on generative modeling. A few bugs may have crept in during refactoring, so if you find any, or spot any other errors, please let me know!


This README was generated with assistance from Claude-3.5-Sonnet.
