
Pretrain with MIMIC-CXR Val Loss #22

Open
SinuoWang opened this issue Jan 23, 2024 · 4 comments

Comments

@SinuoWang

Hi,

I am pretraining the model with the MIMIC-CXR JPG dataset, but the local alignment validation loss looks odd. Is this normal when you pretrain GLoRIA with MIMIC?

Thanks in advance!!
[image: training/validation loss curves]

@GravitySaika

I am training on this dataset too, but your val_loss is much better than mine. Can you share some of your training steps on it?

@SinuoWang
Author

> I am training on this dataset too, but your val_loss is much better than mine. Can you share some of your training steps on it?

Maybe you should check whether you modified the image transforms. It's best to use the transforms that are specific to MIMIC, not CheXpert.

But you don't need to worry too much about the local val loss. I tested this model on some downstream tasks and got normal results, even though the local val loss looks very strange.
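For reference, a minimal sketch of what a MIMIC-specific transform pipeline might look like in torchvision; the sizes and normalization statistics below are placeholders, not values from the GLoRIA repo:

```python
from torchvision import transforms

# Sketch only: the crop sizes and mean/std are assumptions; the real
# statistics should be computed on MIMIC-CXR rather than reused from CheXpert.
mimic_transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # MIMIC-CXR JPGs are grayscale
    transforms.Resize(256),                       # resize the shorter side
    transforms.CenterCrop(224),                   # crop to the encoder input size
    transforms.ToTensor(),                        # PIL image -> float tensor in [0, 1]
    transforms.Normalize(mean=[0.5, 0.5, 0.5],    # placeholder stats; replace with
                         std=[0.5, 0.5, 0.5]),    # values measured on MIMIC-CXR
])
```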

@GravitySaika

> > I am training on this dataset too, but your val_loss is much better than mine. Can you share some of your training steps on it?
>
> Maybe you should check whether you modified the image transforms. It's best to use the transforms that are specific to MIMIC, not CheXpert.
>
> But you don't need to worry too much about the local val loss. I tested this model on some downstream tasks and got normal results, even though the local val loss looks very strange.

Sorry, I am not very familiar with these transforms. What does "tf" mean in your experiment? I also ran into some weird things: downstream binary classification on pneumothorax always shows AUROC = 0.5, and any image passed through the model comes out as an all-NaN tensor. The Adam learning rate never changed. I don't know if that is normal.
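For reference, a quick way to check whether the forward pass or the weights have gone to NaN (a minimal sketch, assuming a PyTorch model whose forward returns a tensor; NaN weights after a loss blow-up are a common cause of a flat AUROC of 0.5):

```python
import torch

def check_for_nans(model: torch.nn.Module, batch: torch.Tensor) -> None:
    """Report NaNs in the model output and in the parameters themselves."""
    model.eval()
    with torch.no_grad():
        out = model(batch)
    print("NaN in output:", torch.isnan(out).any().item())
    for name, p in model.named_parameters():
        if torch.isnan(p).any():
            print("NaN parameter:", name)

# The Adam lr only changes if the scheduler is actually stepped; inspect it with:
#   for g in optimizer.param_groups: print(g["lr"])
```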

@ZiyangZhang0511

> Hi,
>
> I am pretraining the model with the MIMIC-CXR JPG dataset, but the local alignment validation loss looks odd. Is this normal when you pretrain GLoRIA with MIMIC?
>
> Thanks in advance!! [image: training/validation loss curves]

I ran into the same issue as you. The lowest val loss is 3.4, but the train loss can be optimized down to 0.09. This leads to quite low performance for image-text retrieval on the 5x200 test set. Do you know how to improve R@K or P@K to achieve results comparable to those reported in the paper? Thank you in advance.
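For reference, R@K can be computed from an image-text similarity matrix like this (a minimal sketch assuming one matching text per image on the diagonal; the actual 5x200 protocol matches by class label, so the real target set per query would differ):

```python
import torch

def recall_at_k(sim: torch.Tensor, k: int) -> float:
    """Image-to-text R@K for an (N, N) similarity matrix whose
    ground-truth pairs sit on the diagonal."""
    topk = sim.topk(k, dim=1).indices                  # top-k text indices per image
    targets = torch.arange(sim.size(0)).unsqueeze(1)   # (N, 1) ground-truth index
    return (topk == targets).any(dim=1).float().mean().item()

# Random matrix just to show the call; expected R@5 here is roughly 5/200.
sim = torch.randn(200, 200)
print(recall_at_k(sim, k=5))
```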
