
The precision-recall curve (and maybe others) should return a thresholds object with the same size as the x-axis and y-axis #2207

Closed
Yann-CV opened this issue Nov 9, 2023 · 3 comments
Labels
bug / fix (Something isn't working) · help wanted (Extra attention is needed)

Comments


Yann-CV commented Nov 9, 2023

🐛 Bug

The thresholds tensor returned by the precision-recall curve classes/functions (and maybe others) is missing a final value (a threshold greater than the maximum prediction), so it has one element fewer than the precision and recall tensors.

To Reproduce

import torch
from torchmetrics import PrecisionRecallCurve

pred, target = torch.tensor([0.1, 0.4, 0.8]), torch.tensor([0, 1, 1])  # example inputs
pr_curve = PrecisionRecallCurve(task="binary")
precision, recall, thresholds = pr_curve(pred, target)
assert precision.size() == recall.size() == thresholds.size()  # ---------> fails

Expected behavior

pr_curve = PrecisionRecallCurve(task="binary")
precision, recall, thresholds = pr_curve(pred, target)
assert precision.size() == recall.size() == thresholds.size()  # ---------> does not fail

Environment

TorchMetrics 1.2

Additional context

This is an issue when you want to display the threshold value for every point on the curve (even though selecting that last threshold makes no sense). It is also an issue when you want to create a pandas.DataFrame from the returned tuple.
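For example (a minimal sketch, assuming the precision, recall, and thresholds tensors from the snippet above):

import pandas as pd

# raises ValueError ("All arrays must be of the same length") because
# thresholds has one element fewer than precision and recall
df = pd.DataFrame({
    "precision": precision.numpy(),
    "recall": recall.numpy(),
    "threshold": thresholds.numpy(),
})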

Quick fix on our side: we manually appended max_prediction + epsilon to the end of thresholds.
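A minimal sketch of that workaround (the epsilon value is arbitrary; pred and thresholds come from the snippet above):

eps = 1e-6
# pad thresholds with a value above the maximum prediction so that
# all three tensors end up with the same length
thresholds = torch.cat([thresholds, (pred.max() + eps).unsqueeze(0)])
assert precision.size() == recall.size() == thresholds.size()  # now holds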

Yann-CV added the bug / fix and help wanted labels on Nov 9, 2023
Borda (Member) commented Nov 10, 2023

Seems to be the same issue discovered by @matsumotosan in #2154 (comment).

SkafteNicki (Member) commented:

This is intended behavior as it is consistent with sklearn:
[screenshot of the sklearn precision_recall_curve documentation]
https://scikit-learn.org/stable/modules/generated/sklearn.metrics.precision_recall_curve.html
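For reference, sklearn shows the same shape relationship (a minimal sketch with made-up inputs):

from sklearn.metrics import precision_recall_curve

precision, recall, thresholds = precision_recall_curve([0, 1, 1], [0.2, 0.6, 0.9])
# precision and recall have len(thresholds) + 1 entries; the final
# point (precision=1, recall=0) has no corresponding threshold
print(precision.shape, recall.shape, thresholds.shape)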

Yann-CV (Author) commented Nov 24, 2023

@SkafteNicki sorry, I did not notice your answer... thanks!

If the point is to be sklearn-compatible, then I guess it is right to follow their choice.

On my side, because I see TorchMetrics as a substitute for sklearn for metric computation, I still personally think this is a bad design choice.
