Pretrain with MIMIC-CXR Val Loss #22
Hi,
I am pretraining the model with the MIMIC-CXR JPG dataset, but the local alignment validation loss is kinda weird. Is this normal when you pretrain GLoRIA with MIMIC?
Thanks in advance!!

Comments
I am training on this dataset too, but your val_loss is much better than mine. Can you share some of your training steps for it?
Maybe you should check whether you modified the image transforms. It's best to use the transforms that are specific to MIMIC, not CheXpert. But you don't need to worry too much about the local val loss; I tested this model on some downstream tasks and got normal results, even though the local val loss is super weird.
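In case it helps, here is a minimal sketch of what dataset-specific transform pipelines can look like, assuming torchvision; the crop sizes and normalization statistics below are illustrative placeholders, not GLoRIA's actual values (check the repo's config for those):

```python
# Minimal sketch of dataset-specific image transforms (torchvision).
# The sizes and normalization statistics are illustrative placeholders,
# not the values used by GLoRIA -- take the real ones from the config.
from torchvision import transforms

# Hypothetical per-dataset pipelines; using the CheXpert pipeline on
# MIMIC images by mistake can skew the reported val loss.
mimic_transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5], std=[0.5]),  # placeholder stats
])

chexpert_transform = transforms.Compose([
    transforms.Resize(256),
    transforms.RandomCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5], std=[0.5]),  # placeholder stats
])
```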
Sorry, I am not so familiar with these transforms. What does "tf" mean in your experiment? I also ran into some weird things: downstream tasks like binary classification on pneumonia and pneumothorax always show AUROC = 0.5, any image going through the model comes out as an all-NaN tensor, and the Adam learning rate never changed. I don't know if that's normal.
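For anyone debugging the same symptoms, a quick sanity check like the sketch below can confirm the NaN outputs and the stuck learning rate; `model`, `loader`, and `optimizer` are hypothetical stand-ins for whatever your training script defines:

```python
# Quick NaN / learning-rate sanity checks (PyTorch).
import torch

@torch.no_grad()
def outputs_contain_nans(model, loader, device="cuda"):
    """Run one pass over the loader and report whether any output is NaN."""
    model.eval()
    for images, _ in loader:
        out = model(images.to(device))
        if torch.isnan(out).any():
            return True  # NaNs confirmed -- inspect the loss/gradients next
    return False

# If outputs are NaN, common culprits are an exploding loss or a missing
# scheduler.step(); printing the lr each epoch makes the "Adam lr never
# changed" symptom easy to confirm:
# print(optimizer.param_groups[0]["lr"])
```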
I ran into the same issues as you. The lowest val loss is 3.4, but the train loss can be optimized down to 0.09. This leads to quite low image-text retrieval performance on the test 5x200 dataset. Do you know how to improve R@K or P@K to achieve results comparable to those reported in the papers? Thank you in advance.
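For reference, R@K on a retrieval set can be computed with a short sketch like this, assuming L2-normalized image and text embeddings where row i of each matrix forms the ground-truth pair (this is a generic recipe, not the repo's exact evaluation code):

```python
# Minimal sketch of Recall@K for image-to-text retrieval.
import torch

def recall_at_k(img_emb: torch.Tensor, txt_emb: torch.Tensor, k: int = 5) -> float:
    """img_emb, txt_emb: (N, D) L2-normalized embeddings, row i = matched pair."""
    sims = img_emb @ txt_emb.T                       # (N, N) cosine similarities
    topk = sims.topk(k, dim=1).indices               # top-k text indices per image
    targets = torch.arange(len(img_emb)).unsqueeze(1)
    hits = (topk == targets).any(dim=1).float()      # is the ground truth in top-k?
    return hits.mean().item()
```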