
NaNs keep appearing in the loss when a nonlinearity parameter in train.py is greater than 1.96 #20

Open
Holowila opened this issue May 31, 2024 · 1 comment

Comments

@Holowila

If I set args.nonlinearityLTP or args.nonlinearityLTD to a value greater than 1.96, NaNs keep appearing in the loss during training, and an error is then raised in hook.py while converting the decimal data to binary.

First I tried adjusting the learning rate, which didn't work. Then I tried adding normalization before the conversion, but that didn't work either. I can't tell where the NaNs first appear or how to fix this.
[screenshot: training log showing nan values in Loss]
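One way to pin down where the NaNs originate is to instrument the network before touching the converter. Below is a minimal debugging sketch, assuming the project is PyTorch-based; `set_detect_anomaly` and `register_forward_hook` are standard PyTorch APIs, while `model` is a placeholder for the network built in train.py:

```python
import torch

# Make the backward pass report the op that produced the first NaN
# (slow; enable only while debugging).
torch.autograd.set_detect_anomaly(True)

def nan_probe(name):
    """Forward hook that flags the first layer whose output goes non-finite."""
    def hook(module, inputs, output):
        if isinstance(output, torch.Tensor) and not torch.isfinite(output).all():
            raise RuntimeError(f"non-finite values first appear in layer '{name}'")
    return hook

# `model` is a placeholder for the network constructed in train.py.
for name, module in model.named_modules():
    module.register_forward_hook(nan_probe(name))

# Before the decimal-to-binary conversion in hook.py, an explicit check
# makes the failure point at the real source rather than the converter:
#   assert torch.isfinite(tensor).all(), "NaN/Inf reached the converter"
```

If the probe fires on the layer whose update uses the nonlinearity parameter, that would suggest the weight-update math overflows for values above ~1.96, in which case clamping the parameter or the updated weights may be worth trying.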

@uhvardhan

Hi! Were you able to fix this issue? I run into the same thing in some situations and haven't been able to figure out a solution.
