Implement TorchIO transforms wrapper analogous to TorchVision transfo… #7579
Conversation
…rms wrapper and test case Signed-off-by: Fabian Klopfer <[email protected]>
0af6b6b to 2b37b94 (compare)
Signed-off-by: Fabian Klopfer <[email protected]>
Not sure where I need to add the torchio dependency to make CI pass.
Signed-off-by: Fabian Klopfer <[email protected]>
Hi @SomeUserName1, you can add a skip wrapper like this: `MONAI/tests/test_patch_wsi_dataset.py`, line 184 in c86e790
Thanks.
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
…Name1/MONAI into 7499-torchio-transforms-wrapper
One test failed; I have no idea why, as the test and the corresponding code are nothing I've touched.
Looks like a random error; fixed in this commit.
@KumoLiu only the blossom-ci check hasn't passed. Can we merge?
Hi @SomeUserName1, thanks for the PR! We might consider incorporating this new feature after the 1.3.1 release, due to the new requirements included in this PR.
I found this by accident, so first of all: Really nice, I'd love to be able to easily use some of the transforms in torchio, and I think it helps the available transforms there gain visibility.
While looking at this, I had a few thoughts:
- It would be nice to have this available for the dictionary versions as well (like torchvisiond)
- The transform currently does not implement the RandomizableTrait, so I'm not sure how this would interact with CacheDatasets etc. I think it might help to have one version with the RandomizableTrait for all random transforms (these seem easy to find, since their names start with "Random") and one without for the rest. I'm not sure how best to design this, but one could maybe use the `__new__()` method to select the correct one based on the name. Alternatively, an error/warning could be thrown if the wrong version is used.
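The `__new__()` idea above could be sketched roughly like this. All class names here are hypothetical stand-ins, not the actual MONAI API; the dispatch relies only on TorchIO's naming convention that random transforms start with "Random":

```python
class TorchIOWrapper:
    """Hypothetical base wrapper that dispatches to a random or
    non-random variant based on the TorchIO transform name."""

    def __new__(cls, name: str, *args, **kwargs):
        # Only dispatch when the base wrapper is instantiated directly;
        # direct instantiation of a subclass bypasses the name check.
        if cls is TorchIOWrapper:
            if name.startswith("Random"):
                return super().__new__(RandTorchIOWrapper)
            return super().__new__(NonRandTorchIOWrapper)
        return super().__new__(cls)

    def __init__(self, name: str, *args, **kwargs):
        self.name = name


class RandTorchIOWrapper(TorchIOWrapper):
    """Variant that would additionally carry MONAI's RandomizableTrait."""


class NonRandTorchIOWrapper(TorchIOWrapper):
    """Variant for deterministic TorchIO transforms."""
```

Because both subclasses derive from `TorchIOWrapper`, Python still calls `__init__` on the instance returned by `__new__`, so the caller sees an ordinary constructor: `TorchIOWrapper("RandomAffine")` yields a `RandTorchIOWrapper`.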
Implemented.
They also have a common RandomTransform base class in tio, which differs from MONAI's RandomTransform base class. Thus, I added the RandomizableTrait as an ABC to the TorchIO wrapper for all transforms. I'm not sure if I should add Randomizable or RandomTransform as well, as this is already implemented in torchio. Adding these two would be clearer in terms of the capabilities of these transforms, but it would require manipulating TorchIO objects/class instances in MONAI code, introducing complexity with little practical value (probabilistic application can already be done via the tio parameters; is there a benefit in adding the same functionality via MONAI, given the different concepts involved?)
…sses to TorchIO and Transform to TorchVision. document conversion for torchvision and remove conversion for torchio. Add dtypes to docstring for tio Signed-off-by: Fabian Klopfer <[email protected]>
…wrapper Signed-off-by: Fabian Klopfer <[email protected]>
…orms-wrapper' into 7499-torchio-transforms-wrapper Signed-off-by: Fabian Klopfer <[email protected]>
Thank you very much!
Regarding the `p` argument: one could potentially use the MONAI RandomizableTransform and its `prob` parameter, then force `p=1` so that only the MONAI code handles randomness. The only benefits of this, in my opinion, are a slightly more understandable interface on the MONAI side and that MONAI's random state is used, but I'm not sure it's worth it / necessary. As an intermediate way, you could consider rerouting the `prob` parameter to `p` as well (ensuring only one of them is set at any time).
Signed-off-by: Fabian Klopfer <[email protected]>
…ct. add tags file generated by ctags to gitignore, make `prob` and `p` kwargs mutually exclusive for TorchIO transforms and initialize `p` with `prob` if the latter was provided. Add test cases for applying the same transform Signed-off-by: Fabian Klopfer <[email protected]>
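The `prob`/`p` handling described in that commit could look roughly like this. The helper name and defaults are assumptions for illustration, not the exact MONAI code:

```python
def resolve_probability(prob=None, p=None):
    """Map MONAI-style `prob` onto TorchIO-style `p`.

    The two kwargs are mutually exclusive; if `prob` is given it is
    rerouted to `p`, and if neither is given the transform is always
    applied (probability 1.0).
    """
    if prob is not None and p is not None:
        raise ValueError("Please provide either `prob` or `p`, not both.")
    if prob is not None:
        return prob  # reroute MONAI's `prob` to TorchIO's `p`
    return 1.0 if p is None else p
```

The wrapper's `__init__` could then pass `p=resolve_probability(prob, p)` straight into the underlying TorchIO transform, keeping a single source of truth for the application probability.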
Sorry for the delayed review; the point release is finally done. I have some concerns about the support for randomness; I'll leave comments inline.
@ericspod could you please also help review the PR? Thanks!
Co-authored-by: YunLiu <[email protected]> Signed-off-by: Fabian Klopfer <[email protected]>
Hi @SomeUserName1, apologies for overlooking this PR. I’ll review it and aim to have it merged by the end of the week. Thanks.
Hi @SomeUserName1, I think it still stands that there's an issue with randomness. We can't change whether a transform is considered random at runtime, since it's type-based, so either we have a random and a non-random version of the transforms, or we state in the docstring to use only randomised TorchIO transforms. If you're good with the latter, we can merge sooner after a little review.
No worries! I agree that the randomness issue still stands, and will implement the latter version.
…as well Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
…as well Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
…annotations Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
…Name1/MONAI into 7499-torchio-transforms-wrapper
Signed-off-by: Fabian Klopfer <[email protected]>
Signed-off-by: Fabian Klopfer <[email protected]>
I think this is good now, thanks!
/build
…rms wrapper and test case
Fixes #7499.
Description
As discussed in the issue, this PR implements a wrapper class for TorchIO transforms, analogous to the TorchVision transforms wrapper.
The test cases just check that the transforms are callable and that, after applying a transform, the result differs from the input.
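That smoke-test pattern could be sketched as below. A stand-in noise transform is used in place of a real wrapped TorchIO transform so the sketch stays self-contained; `add_noise` and the shapes are illustrative assumptions, not the PR's actual test code:

```python
import numpy as np

_rng = np.random.default_rng(0)

def add_noise(img):
    """Stand-in for a wrapped TorchIO transform: adds Gaussian noise."""
    return img + _rng.normal(0.0, 0.1, img.shape)

def check_transform_changes_input(transform):
    # channel-first 3D volume, as TorchIO expects (C, D, H, W)
    img = np.zeros((1, 8, 8, 8), dtype=np.float32)
    assert callable(transform)        # the transform is callable
    out = transform(img)
    assert out.shape == img.shape     # shape is preserved
    assert not np.allclose(out, img)  # result differs from the input

check_transform_changes_input(add_noise)
```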
Types of changes
- `./runtests.sh -f -u --net --coverage`
- `./runtests.sh --quick --unittests --disttests`
- `make html` command in the `docs/` folder.