problem when synthesize #10
Comments
Hi, can you upload your result wav?
raw: http://github.com/binyi10/-/blob/master/yibintest7.wav
Sorry, my fault. You can copy these URLs into your browser, or `git clone http://github.com/binyi10/-` to your PC; then you can listen to the wave files.
Hello, instead of using the LJSpeech data I use my own dataset, so I changed some parameters in the source code (e.g. set hop_length = 120). The results have been uploaded. Do you meet a similar problem?
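As a hedged aside (the thread does not state the dataset's sample rate), `hop_length` is usually derived from the sample rate and the desired frame shift, so a value of 120 would correspond, for example, to a 5 ms shift at 24 kHz:

```python
# Hypothetical values: the sample rate and frame shift below are assumptions,
# chosen only to show how hop_length = 120 could arise.
sample_rate = 24000        # Hz (assumed, not stated in the thread)
frame_shift_ms = 5         # milliseconds per frame (assumed)

hop_length = sample_rate * frame_shift_ms // 1000
print(hop_length)          # 120 under these assumptions
```

If the dataset's sample rate differs from LJSpeech's, other parameters tied to it (window size, mel filterbank range) may need matching changes as well.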
How many audio samples are in your dataset? The scale of the loss function varies from dataset to dataset (VCTK, LJSpeech, ...). In the case of LJSpeech, it's near -4.5.
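For readers puzzled by a negative loss: flow-based vocoders minimize a negative log-likelihood, and for continuous data the log-density can exceed zero, so the loss can legitimately be negative. A minimal sketch with a one-dimensional Gaussian (not the project's actual loss, just an illustration of the sign):

```python
import math

def gaussian_nll(x, mu, sigma):
    # Negative log density of N(mu, sigma^2) evaluated at x.
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# A sharply peaked density has values > 1, so its NLL is negative:
nll = gaussian_nll(0.0, 0.0, 0.01)
print(nll < 0)  # True: the NLL is negative for a sharp Gaussian
```

This is also why the "reasonable" loss scale depends on the dataset: the achievable density over the waveform samples differs between corpora.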
Your wave file may be corrupted; it cannot be opened.
I use 30k audio samples, but my CUDA memory only lets me use batch size = 4, so I think that may be suboptimal. I tried to change your code to multi-GPU using the PyTorch function `DataParallel(model)`, modifying train.py around line 110,
but the loss seems wrong, and the synthesized audio is just noise.
Actually, for multi-GPU training there is a subtle caveat.
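One common `DataParallel` pitfall (a hedged sketch, not the repo's actual fix, and `FlowLossWrapper` is a hypothetical name) is computing the scalar loss outside the model: each replica's output is gathered onto one GPU, so it is safer to return a per-sample loss from `forward()` and reduce it once after gathering:

```python
import torch
import torch.nn as nn

class FlowLossWrapper(nn.Module):
    """Hypothetical wrapper: compute the per-sample loss inside forward()
    so each DataParallel replica returns its own loss values."""
    def __init__(self, flow):
        super().__init__()
        self.flow = flow

    def forward(self, x):
        out = self.flow(x)
        # Placeholder per-sample loss; the real model would return its
        # negative log-likelihood per sample here.
        return out.pow(2).mean(dim=1)

flow = nn.Linear(8, 8)            # stand-in for the real flow model
model = FlowLossWrapper(flow)
if torch.cuda.device_count() > 1:  # wrap only when multiple GPUs exist
    model = nn.DataParallel(model)

x = torch.randn(4, 8)
loss = model(x).mean()             # reduce the gathered per-sample losses once
loss.backward()
```

Reducing outside the wrapper keeps the backward pass consistent regardless of how many replicas produced the per-sample values.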
The latest commit now supports the proper multi-GPU training. |
We'll close the issue, but feel free to re-open if the problem persists. FYI: please also refer to #13.
Thanks for this nice job!
But I have some problems when synthesizing: there is always reverberation in the synthesized audio compared with the raw audio. Does anyone have the same problem (batch size = 4, 1000k steps)? I guess the "change order" module may lead to this problem?
After 200k steps, I found the loss stays almost unchanged in (-3.4, -3.7), and the synthesis results are similar from 200k to 1000k steps. So I want to ask: is this reasonable? And if not, what scale of loss is reasonable?