
Changing upscale factor? #24

Open
MasterScrat opened this issue Jan 10, 2019 · 5 comments

MasterScrat commented Jan 10, 2019

Is there any easy way to change the upscale=4 factor? Would that require a specifically trained network?

xinntao (Owner) commented Jan 14, 2019

Hi @MasterScrat
Yes, you need to train or fine-tune specific networks for other factors such as x2 and x3.

Alternatively, we could first upsample by 4 and then downsample with a simple bicubic or bilinear method, but I am not sure the final results will be OK.
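The upsample-then-downsample workaround can be sketched as follows. This is a minimal illustration, not code from the repo: `downsample_2x` uses simple 2x2 average pooling as a stand-in for the bilinear downsampling mentioned above, and the input array just simulates a x4 ESRGAN output.

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average-pool 2x2 blocks: a simple stand-in for bilinear
    downsampling. img is HxWxC with even H and W."""
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Pretend this is an ESRGAN x4 output for a 16x16 input ...
sr_x4 = np.random.rand(64, 64, 3)
# ... halving it yields an effective x2 result.
sr_x2 = downsample_2x(sr_x4)
print(sr_x2.shape)  # (32, 32, 3)
```

In practice one would use a proper bicubic resize (e.g. from Pillow or OpenCV) rather than box averaging, but the effective-scale arithmetic is the same.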

MasterScrat (Author) commented

I see. How much time and hardware resources were necessary to train RRDB_PSNR_x4?

xinntao (Owner) commented Jan 17, 2019

@MasterScrat
I used a Titan Xp and it took about one week. Really slow...

You may speed it up with multi-GPU training.
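Multi-GPU training in PyTorch circa 2019 typically meant wrapping the model in `nn.DataParallel`, which splits each batch across the visible GPUs. A minimal sketch, using a toy model as a stand-in for the actual generator:

```python
import torch
import torch.nn as nn

# Toy conv model standing in for the ESRGAN generator.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 3, 3, padding=1),
)

# DataParallel scatters each input batch across all visible GPUs and
# gathers the outputs; on a single-GPU or CPU-only machine it simply
# runs the wrapped module as-is.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

out = model(torch.randn(4, 3, 32, 32))
print(out.shape)
```

Note that current PyTorch documentation recommends `DistributedDataParallel` over `DataParallel` for better scaling.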

DeltaDesignRus commented May 17, 2019

@xinntao
Is it possible to use scale x1?
I want to train the network to remove JPEG compression artifacts without upscaling.

xinntao (Owner) commented May 19, 2019

@DeltaDesignRus
Though it is possible to use scale x1 (remove the upsampling layers), it is recommended to use models designed specifically for compression artifact removal.
There may be better architectures for that task, such as U-Net-style designs, which are also faster.
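To illustrate the "remove the upsampling layers" idea: an x1 network keeps a convolutional trunk but drops the PixelShuffle/interpolation stages, so output resolution equals input resolution. The sketch below is hypothetical and simplified (a plain conv body rather than ESRGAN's RRDB blocks); `ArtifactRemovalNet` is not a class from this repo.

```python
import torch
import torch.nn as nn

class ArtifactRemovalNet(nn.Module):
    """Hypothetical x1 variant: a conv trunk with no upsampling layers,
    so the output has the same spatial size as the input."""

    def __init__(self, channels: int = 3, features: int = 64, num_blocks: int = 4):
        super().__init__()
        self.head = nn.Conv2d(channels, features, 3, padding=1)
        body = []
        for _ in range(num_blocks):
            body += [nn.Conv2d(features, features, 3, padding=1),
                     nn.ReLU(inplace=True)]
        self.body = nn.Sequential(*body)
        self.tail = nn.Conv2d(features, channels, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Global residual connection: the network only has to learn
        # the artifact correction, not reproduce the whole image.
        return x + self.tail(self.body(self.head(x)))

net = ArtifactRemovalNet()
out = net(torch.randn(1, 3, 32, 32))
print(out.shape)  # same spatial size as the input
```

Trained with pairs of (JPEG-compressed, clean) patches, such a network restores images at their original resolution, which matches the x1 use case asked about above.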
