I have two questions about the UNet architecture and Dice loss function:
In the UNet paper, there is no ReLU layer after the final conv. In experiments with my own implementation of UNet, such a ReLU doesn't help. I don't know whether the ReLU on Line 136 of unet.py helps or not.
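To illustrate why a trailing ReLU can hurt (a sketch with hypothetical logits, assuming the final conv output is fed to a sigmoid for binary segmentation): clamping negative logits to zero before the sigmoid forces every predicted probability to be at least 0.5, so the network can never express confidence in the background class.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical logits from the final 1x1 conv of a UNet-style network.
logits = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])

# Without an extra ReLU, probabilities can fall on both sides of 0.5.
probs_plain = sigmoid(logits)

# With ReLU applied first, every negative logit is clamped to 0,
# so sigmoid(ReLU(x)) >= sigmoid(0) = 0.5 for every pixel.
probs_relu = sigmoid(np.maximum(logits, 0.0))

print(probs_plain.min())  # can be well below 0.5
print(probs_relu.min())   # never below 0.5
```

This matches the paper, which applies no activation between the final conv and the loss.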
IMHO, the Dice coefficient can be regarded as 2 · # of True Positives / (# of Predicted Positives + # of Ground-Truth Positives), but the Dice loss on Line 231 of unet.py counts both positives and negatives if I understand the code correctly.
Please correct me if I am wrong. Thanks.
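A small sketch of the distinction with hypothetical binary masks (NumPy, not the repo's actual code): a foreground-only soft Dice uses just the positive class, whereas averaging Dice over both the foreground and background channels lets the (usually dominant) background inflate the score.

```python
import numpy as np

# Hypothetical flattened binary masks (1 = foreground pixel).
pred   = np.array([1, 1, 0, 0, 0, 0], dtype=float)
target = np.array([1, 0, 1, 0, 0, 0], dtype=float)

def dice_foreground(p, t, eps=1e-8):
    # 2*TP / (#predicted positives + #ground-truth positives):
    # only foreground pixels contribute.
    inter = (p * t).sum()
    return 2.0 * inter / (p.sum() + t.sum() + eps)

def dice_both_classes(p, t, eps=1e-8):
    # Averages Dice over foreground AND background, so correctly
    # predicted negatives also raise the score.
    fg = dice_foreground(p, t, eps)
    bg = dice_foreground(1.0 - p, 1.0 - t, eps)
    return 0.5 * (fg + bg)

# TP=1, predicted positives=2, true positives=2 -> 2*1/(2+2) = 0.5
print(dice_foreground(pred, target))
# Background agreement pushes the averaged score above 0.5.
print(dice_both_classes(pred, target))
```

On heavily background-dominated images the gap between the two grows, which is exactly the concern raised above.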
I think you're right. The code diverges from the paper architecture. There is no particular reason why I added this. ATM I'm not sure how it affects the performance.
If someone sends a PR addressing this, I would love to hear your thoughts on it.