This repository implements two different approaches to neural style transfer (NST): basic (optimization-based) NST and fast (feedforward) NST. The code is implemented and tested on Ubuntu with Python 3.10. The iCartoonFace dataset is used for training the fast NST models. More implementation details can be found in this blog post.
Content images: minion1.jpg | minion2.jpg
Style images: Monet | Van Gogh | Picasso
Basic NST output: a = 1 for all; b = 0.01, 100, 100000 from left to right
Fast NST output: a = 1, b = 50000 for all models
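In the captions above, a and b are the content and style weights of the NST objective (assuming the usual weighted-sum formulation). A minimal illustrative sketch; the function and argument names are not taken from this repository's code:

```python
# Illustrative only: how the weights a and b enter the NST objective.
# Increasing b relative to a pushes the output further toward the style image.
def total_nst_loss(content_loss, style_loss, a=1.0, b=100.0):
    return a * content_loss + b * style_loss
```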
Clone the project:
```bash
git clone https://github.com/lihanlian/basic-and-fast-neural-style-transfer
```
Go to the project directory, create a virtual environment, and install the dependencies:
```bash
cd basic-and-fast-neural-style-transfer
python3 -m venv env && source env/bin/activate
pip install -r requirements.txt
```
- run basic_nst.py to generate images using the basic (optimization-based) neural style transfer algorithm; a minimal sketch of this optimization loop appears after this list.
- run fast_nst_training.py to train the feedforward convolutional neural network using the fast neural style transfer algorithm.
- run fast_nst_inference.py to generate images with a model trained by fast_nst_training.py; see the inference sketch after this list.
- fast_nst_transformer_net.py defines the neural network that is trained and later used for inference (generating images).
- fast_nst_vgg_net.py defines the loss network (based on pre-trained VGG19) used for calculating the loss function during training.
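The basic NST algorithm in basic_nst.py optimizes the pixels of the output image directly against features from a pre-trained VGG19 network. Below is a minimal, self-contained sketch of that idea; the layer choices, default weights, and helper names are assumptions for illustration and do not necessarily match this repository's code.

```python
# Minimal sketch of optimization-based NST (in the spirit of basic_nst.py).
# Layer indices, weights, and names are assumed choices, not this repo's exact code.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYER = 21                 # conv4_2 in torchvision's VGG19 features (assumed choice)
STYLE_LAYERS = [0, 5, 10, 19, 28]  # conv1_1 .. conv5_1 (assumed choice)

def features(x, layers):
    """Run x through VGG19 and collect activations at the requested layer indices."""
    out = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layers:
            out[i] = x
    return out

def gram(feat):
    """Normalized Gram matrix used for the style loss."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def stylize(content, style, a=1.0, b=1e5, steps=300, lr=0.05):
    """Optimize the output image's pixels directly.

    content, style: 1x3xHxW tensors on the same device as vgg,
    normalized with ImageNet mean/std. a, b: content and style weights.
    """
    target_c = features(content, [CONTENT_LAYER])[CONTENT_LAYER].detach()
    target_s = {i: gram(f).detach() for i, f in features(style, STYLE_LAYERS).items()}
    img = content.clone().requires_grad_(True)
    opt = torch.optim.Adam([img], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feats = features(img, [CONTENT_LAYER] + STYLE_LAYERS)
        c_loss = F.mse_loss(feats[CONTENT_LAYER], target_c)
        s_loss = sum(F.mse_loss(gram(feats[i]), target_s[i]) for i in STYLE_LAYERS)
        (a * c_loss + b * s_loss).backward()
        opt.step()
    return img.detach()
```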
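For fast NST, fast_nst_inference.py loads the trained feedforward network and stylizes an image in a single forward pass. A hypothetical usage sketch follows; the class name TransformerNet, checkpoint path, image paths, and the assumption that the network outputs values in [0, 1] are illustrative only, so check fast_nst_transformer_net.py and fast_nst_inference.py for the actual names.

```python
# Hypothetical fast-NST inference sketch; names and output range are assumptions.
import torch
from PIL import Image
from torchvision import transforms
from fast_nst_transformer_net import TransformerNet  # assumed class name

device = "cuda" if torch.cuda.is_available() else "cpu"
model = TransformerNet().to(device).eval()
model.load_state_dict(torch.load("checkpoints/monet.pth", map_location=device))

to_tensor = transforms.Compose([transforms.Resize(256), transforms.ToTensor()])
content = to_tensor(Image.open("images/minion1.jpg").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    stylized = model(content).clamp(0, 1).squeeze(0).cpu()

transforms.ToPILImage()(stylized).save("minion1_monet.jpg")
```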
- Gatys, Leon A., Alexander S. Ecker, and Matthias Bethge. "A Neural Algorithm of Artistic Style." arXiv preprint arXiv:1508.06576 (2015) (Basic NST Paper)
- Neural Style Transfer (NST) — theory and implementation (Basic NST Blog Post)
- Johnson, Justin, Alexandre Alahi, and Fei-Fei Li. "Perceptual Losses for Real-Time Style Transfer and Super-Resolution." European Conference on Computer Vision (2016) (Fast NST Paper)
- pytorch/examples/fast_neural_style (Fast NST Implementation)
- Downsampling and Upsampling of Images — Demystifying the Theory
- iCartoonFace Dataset