First data type schemas #524
base: v5
Conversation
@Peyman-N, @apdavison and @lzehl I will keep this as a draft for now, but the content is ready for review. Except for one new ControlledTerms terminology*, everything should be functional in the existing framework, meaning the coordinateSpace still points to the old setup. This will need to be updated when we do the SANDS updates. I restructured the content from the issue, since it seems that several image types share properties (which makes sense). The schemas are structured in the following way:
Looking forward to discussing this draft 🙂
To be further discussed: 2DRaster compared to:
Thanks @lzehl for pointing me to this PR. Apologies that I only have a chance to be involved now, and not sooner. I do have some questions for everyone involved:
We currently work with the outputs of voluba (3-dimensional raster-based images) and quicknii (2-dimensional raster-based images). In both cases, through user interaction, a transform file is often produced (in the case of voluba, a 4x4 affine; in the case of quicknii, if I understand correctly, effectively a 3x3 matrix). In the current proposal
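For context, a voluba-style 4x4 affine maps voxel coordinates to physical coordinates via homogeneous coordinates. The sketch below is purely illustrative: the matrix values are made up and do not come from any actual voluba output.

```python
def apply_affine(matrix, point):
    """Apply a 4x4 affine to a 3D point using homogeneous coordinates."""
    x, y, z = point
    vec = (x, y, z, 1.0)
    out = [sum(m * v for m, v in zip(row, vec)) for row in matrix]
    return out[:3]  # drop the homogeneous component

# Hypothetical affine: isotropic 0.5 scaling plus a translation.
affine = [
    [0.5, 0.0, 0.0, 10.0],
    [0.0, 0.5, 0.0, 20.0],
    [0.0, 0.0, 0.5, 30.0],
    [0.0, 0.0, 0.0, 1.0],
]

print(apply_affine(affine, (2.0, 4.0, 6.0)))  # [11.0, 22.0, 33.0]
```

The same mechanism covers the quicknii case if the 3x3 matrix is embedded into a 4x4 identity.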
In such a case, the relationship between the source (e.g.

I want to point out the effort done by ngff v0.5 [1] (which is an extension to zarr v3 [2]) on how they handle the pixel/voxel --> physical unit problem [3]. They allow the definition of a list of transformation steps, which at the moment include scaling (not dissimilar to this proposal) and translation. In array form, it lends itself to extension, should the initial proposal not cover edge cases (e.g. rotation can be introduced via an additional transform type, and nonlinear transforms too).

Whilst ngff (and by extension, zarr) is mainly a data format primarily used to store raster 2D/3D image data, its approach to informing the client about how to transform from one space to another seems flexible.

edit: ping @UlrikeS91 @Peyman-N @lzehl

[1] https://ngff.openmicroscopy.org/latest/
[2] https://zarr-specs.readthedocs.io/en/latest/v3/core/v3.0.html
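To make the list-of-transformations idea concrete, here is a minimal sketch of applying an ngff-style transformation list (scale, then translation) to a voxel index. The transform values and the exact dictionary shape are assumptions for illustration, not a verbatim reproduction of the ngff metadata format.

```python
# Hypothetical transformation list, loosely modelled on ngff's
# "scale" and "translation" coordinate transform types.
transforms = [
    {"type": "scale", "scale": [0.5, 0.5, 2.0]},              # voxel size in physical units
    {"type": "translation", "translation": [1.0, 2.0, 3.0]},  # offset of the origin
]

def to_physical(voxel, transforms):
    """Apply each transform in order to map a voxel index to a physical coordinate."""
    point = list(voxel)
    for t in transforms:
        if t["type"] == "scale":
            point = [p * s for p, s in zip(point, t["scale"])]
        elif t["type"] == "translation":
            point = [p + d for p, d in zip(point, t["translation"])]
        else:
            raise ValueError(f"unsupported transform type: {t['type']}")
    return point

print(to_physical([10, 10, 10], transforms))  # [6.0, 7.0, 23.0]
```

The appeal of the list form is exactly what the comment above describes: a new transform type (rotation, nonlinear warp) can be added as another list entry without changing the existing ones.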
@xgui3783 thanks for getting involved here. Your input is needed and greatly appreciated. I will respond in more detail later, but here already something that might clarify our approach (so far):

The data type schemas suggested here would indeed not include anchoring information. That is not their purpose. Anchoring information would be stored in a separate schema (because there could be multiple anchorings to different coordinate spaces associated with the same file), which we should further discuss. The coordinate space in the schemas provided here will always refer to the coordinate space of the file (i.e. the saved transformation result), not to the coordinate space a file could be transformed to (with the given transformation information).

I would suggest having a dedicated meeting on how we can extend SANDS with transformation activities and their respective results. We discussed this multiple times in the past but always dropped it for time and complexity reasons.

@openMetadataInitiative/openminds-developers and @xgui3783 please provide additional comments/input to further shape this PR and the related issues.
Do I understand correctly, then, that the following examples should have the corresponding
But the following should not:
adjusted draft of #523