Replies: 2 comments
-
Yes, I think that makes sense. Have you looked into this SSL pretraining demo? https://github.com/Project-MONAI/tutorials/tree/main/self_supervised_pretraining
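For reference, SSL pretraining demos of this kind typically pair a reconstruction loss with a contrastive loss over two augmented views of each slice. The NT-Xent formulation sketched below in plain NumPy is an illustrative assumption about that contrastive term, not the tutorial's actual code:

```python
import numpy as np

def ntxent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss.

    z1, z2: (N, D) embeddings of two augmented views of the same N slices.
    Positive pairs are (z1[i], z2[i]); every other pairing is a negative.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize rows
    sim = z @ z.T / temperature                        # (2N, 2N) cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = z1.shape[0]
    # index of each row's positive partner: i <-> n + i
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy of the positive against all candidates: -log softmax(sim)[i, pos[i]]
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)
    return loss.mean()
```

Nearly identical views of the same slices should produce a noticeably lower loss than unrelated embeddings, which is what drives the backbone to learn augmentation-invariant features.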
-
Great, thank you for your reply! Not yet; I'll definitely delve into it. The relative performance improvement column at the bottom already looks good. I hope I'll be able to contribute to the project within a few months then.
-
Hi all!
I've stumbled upon MONAI lately and would like to share my idea with you.
I'm interested in the latest ViT and self-supervised learning (SSL) advancements and would like to write my master's thesis on pretraining a ViT-based, SSL-trained backbone for processing CT images (mostly 2D, i.e. individual slices). I would like to conduct a series of experiments and also share the resulting models in your model zoo. I hope such a project would help in applying ML to downstream tasks, as there are tons of unlabelled CTs (or CTs labelled only for one very specific task).
As far as I know, there hasn't been anything like this yet (only one paper, but with no implementation or checkpoints). I would love to hear your opinion. Does it make sense?