
Perplexity dtype restriction too strict #2224

Closed
ZhaofengWu opened this issue Nov 19, 2023 · 0 comments · Fixed by #2235
Labels: bug / fix (Something isn't working), help wanted (Extra attention is needed), v1.2.x

Comments

@ZhaofengWu

🐛 Bug

The Perplexity metric requires the input dtype to be either fp32 or fp64, so it rejects other floating-point dtypes such as fp16, and users need to manually recast their tensors before calling the metric.

_TORCH_FLOAT_OR_DOUBLE = (torch.float32, torch.float64)
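For illustration, here is a torch-free sketch of the dtype gate that constant drives, next to a relaxed variant that accepts any floating-point dtype. The function names and the string-based dtype representation are invented for this example; they are not torchmetrics internals.

```python
# Illustrative sketch (not the actual torchmetrics source). Dtypes are
# represented as plain name strings so the example runs without torch.
_TORCH_FLOAT_OR_DOUBLE = ("float32", "float64")                   # current whitelist
_FLOATING_POINT = ("float16", "bfloat16", "float32", "float64")   # relaxed set

def check_strict(dtype: str) -> bool:
    """Mirror of the current behavior: only fp32/fp64 pass."""
    return dtype in _TORCH_FLOAT_OR_DOUBLE

def check_relaxed(dtype: str) -> bool:
    """Proposed behavior: any floating-point dtype passes."""
    return dtype in _FLOATING_POINT

assert not check_strict("float16")   # fp16 is rejected today
assert check_relaxed("float16")      # fp16 passes under the relaxed check
assert check_relaxed("float64")      # fp64 still passes either way
```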

Expected behavior

The metric should accept other floating point dtypes.
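Until the check is relaxed upstream, a caller-side workaround is to upcast half-precision inputs before updating the metric. A minimal sketch, again using dtype-name strings instead of real torch tensors; `upcast_if_needed` is a hypothetical helper, not part of any library:

```python
# Hypothetical workaround sketch: promote fp16/bf16 to fp32 so the input
# satisfies the current fp32/fp64 restriction; leave other dtypes untouched.
def upcast_if_needed(dtype: str) -> str:
    """Return the dtype the metric should receive."""
    return "float32" if dtype in ("float16", "bfloat16") else dtype

assert upcast_if_needed("float16") == "float32"   # recast before the metric
assert upcast_if_needed("bfloat16") == "float32"  # bf16 promoted as well
assert upcast_if_needed("float64") == "float64"   # fp64 passed through
```

With real tensors this corresponds to calling `.float()` on fp16/bf16 logits before `Perplexity.update`.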

@ZhaofengWu added the bug / fix and help wanted labels on Nov 19, 2023
@Borda added the v1.2.x label on Nov 20, 2023
Development

Successfully merging a pull request may close this issue.

3 participants