
Help fine-tuning on other CT data #39

Open
AgustinaLaGreca opened this issue Apr 28, 2021 · 3 comments

Comments

@AgustinaLaGreca

Hello @MrGiovanni, thanks a lot for your work! It is very cool and interesting.

I am working on image classification (disease positive vs. disease negative) in CT of head and neck cancer. The problem is that, as you state, the Genesis Chest CT model was trained on the -1000 HU to 1000 HU range, and my range of interest is much narrower (approx. -50 HU to 200 HU). My dataset is rather limited (fewer than 600 patients), which is why I think a pre-trained model is a good approach. I was wondering if you have any advice on this problem or any data normalization technique. Thanks a lot in advance,

Best,

Agustina

@MrGiovanni
Owner

Hi @AgustinaLaGreca

Thanks for reaching out. If your range of interest is -50 to 200 HU, I think it is still okay to use Models Genesis, because we observed performance gains on the CTPA and FLAIR imaging modalities, whose intensity values are very different from the original CT range.

Just make sure you normalize the input between [0, 1] by

hu_min, hu_max = -50, 200                      # clip window in Hounsfield units
image[image < hu_min] = hu_min                 # clip values below the window
image[image > hu_max] = hu_max                 # clip values above the window
image = (image - hu_min) / (hu_max - hu_min)   # rescale to [0, 1]
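A self-contained sketch of that clip-and-rescale step, assuming the CT volume is a NumPy array (the helper name `normalize_hu` is mine, not from the repository):

```python
import numpy as np

def normalize_hu(image, hu_min=-50.0, hu_max=200.0):
    """Clip a CT volume to [hu_min, hu_max] HU, then rescale to [0, 1]."""
    image = np.clip(image, hu_min, hu_max).astype(np.float32)
    return (image - hu_min) / (hu_max - hu_min)

# Toy 1-D "patch" in Hounsfield units: air, both window edges, soft tissue, bone
patch = np.array([-1000.0, -50.0, 75.0, 200.0, 1000.0])
print(normalize_hu(patch))  # -> [0.  0.  0.5 1.  1. ]
```

Note that everything below -50 HU (e.g. air, fat) collapses to 0 and everything above 200 HU (e.g. bone) collapses to 1, so the chosen window should cover all tissue contrast relevant to the task.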

Thanks,

Zongwei

@AgustinaLaGreca
Author

Thanks a lot for the super fast reply @MrGiovanni !

Ok, then I was on the right track with the normalization. Can I also ask if you have any tips on hyper-parameter search for fine-tuning (size of the fully-connected layers, number of frozen layers, etc.)? Thanks a lot in advance!!

@MrGiovanni
Owner

Hi @AgustinaLaGreca

Sorry for the delay. No, I don't have a fancy hyper-parameter search or optimizer for fine-tuning. Everything uses the default Keras/PyTorch setup (Adam optimizer), and every layer is trainable. The size of the fully-connected layers can be found at https://github.com/MrGiovanni/ModelsGenesis/tree/master/keras#3-fine-tune-models-genesis-on-your-own-target-task
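A minimal PyTorch sketch of that setup (default Adam, no frozen layers). The tiny 3-D encoder below is a stand-in of my own; in practice you would load the pre-trained Models Genesis backbone in its place:

```python
import torch
import torch.nn as nn

# Stand-in encoder; replace with the pre-trained Models Genesis backbone.
encoder = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool3d(1),
    nn.Flatten(),
)
classifier = nn.Linear(8, 2)  # disease positive vs. negative
model = nn.Sequential(encoder, classifier)

# Every layer trainable (no frozen layers), optimizer with default settings.
for p in model.parameters():
    p.requires_grad = True
optimizer = torch.optim.Adam(model.parameters())

# One toy training step on a fake batch of 3-D patches.
x = torch.randn(2, 1, 16, 16, 16)
logits = model(x)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1]))
loss.backward()
optimizer.step()
```

The point is only that nothing special is done: all weights are updated end-to-end with Adam's defaults.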

Thanks,

Zongwei
