Bug/992 precision loss #993
base: main
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```diff
@@           Coverage Diff            @@
##            main     #993     +/-  ##
========================================
- Coverage   92.26%   92.24%   -0.02%
========================================
  Files          84       84
  Lines       12447    12454       +7
========================================
+ Hits        11484    11488       +4
- Misses        963      966       +3
```

Flags with carried forward coverage won't be shown.
This pull request is stale because it has been open for 60 days with no activity.
Thank you for the PR!
Description
Fixed precision loss in several functions when using the `float64` data type.

Issue/s resolved: #992
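The underlying pattern can be sketched in plain Python, using the stdlib `struct` module to emulate a `float32` round-trip in place of torch's default dtype (the value and variable names here are illustrative, not taken from the PR):

```python
import struct

x = 0.1234567890123456789  # a value that needs float64 precision

# "Before" behaviour: the value is first stored at the default
# float32 precision and only then converted to float64 -- the
# float32 rounding error is already baked in and cannot be undone.
lossy = struct.unpack("f", struct.pack("f", x))[0]

# "After" behaviour: the desired dtype is passed to the backend
# function up front, so the value is stored as float64 from the start.
exact = float(x)

print(abs(lossy - x))  # nonzero: on the order of float32 epsilon
print(abs(exact - x))  # 0.0: no extra rounding step
```

Converting after the fact cannot recover the bits that `float32` already discarded, which is why the fix passes the dtype into the backend call rather than casting the result.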
Changes proposed:
Pass the `dtype` argument to the backend torch functions instead of converting to the desired data type after using the default torch data type.

Type of change
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
No.
skip ci