Added link_data --> clear_cache relationship to support repacking Zarr NWB files #215

Merged 7 commits on Oct 31, 2024
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -6,6 +6,7 @@
* NWBZarrIO load_namespaces=True by default. @mavaylon1 [#204](https://github.com/hdmf-dev/hdmf-zarr/pull/204)
* Added test for opening file with consolidated metadata from DANDI. @mavaylon1 [#206](https://github.com/hdmf-dev/hdmf-zarr/pull/206)
* Add dimension labels compatible with xarray. @mavaylon1 [#207](https://github.com/hdmf-dev/hdmf-zarr/pull/207)
* Added link_data --> clear_cache relationship to support repacking Zarr NWB files. [#215](https://github.com/hdmf-dev/hdmf-zarr/pull/215)

## 0.8.0 (June 4, 2024)
### Bug Fixes
9 changes: 9 additions & 0 deletions src/hdmf_zarr/backend.py
@@ -362,6 +362,8 @@ def export(self, **kwargs):
write_args['export_source'] = src_io.source # pass export_source=src_io.source to write_builder
ckwargs = kwargs.copy()
ckwargs['write_args'] = write_args
if not write_args.get('link_data', True):
ckwargs['clear_cache'] = True
super().export(**ckwargs)
if cache_spec:
self.__cache_spec()
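
For context, a minimal repacking sketch (paths are hypothetical; NWBZarrIO comes from hdmf_zarr.nwb). Exporting with write_args={"link_data": False} asks the backend to copy datasets into the destination store rather than link back to the source; with this change the backend also forwards clear_cache=True to the parent export call.

```python
from hdmf_zarr.nwb import NWBZarrIO

# Hypothetical paths: repack a Zarr NWB file so all data is copied, not linked.
with NWBZarrIO("source.nwb.zarr", mode="r") as read_io:
    with NWBZarrIO("repacked.nwb.zarr", mode="w") as export_io:
        # link_data=False requests copies of the datasets; the change above
        # then sets clear_cache=True before calling the parent export.
        export_io.export(src_io=read_io, write_args={"link_data": False})
```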
@@ -1305,6 +1307,13 @@ def __list_fill__(self, parent, name, data, options=None):  # noqa: C901
except ValueError:
for i in range(len(data)):
dset[i] = data[i]
except TypeError: # If data is an h5py.Dataset with strings, they may need to be decoded
for c in np.ndindex(data_shape):
o = data
for i in c:
o = o[i]
# bytes are not JSON serializable
dset[c] = o if not isinstance(o, (bytes, np.bytes_)) else o.decode("utf-8")
return dset

def __scalar_fill__(self, parent, name, data, options=None):
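A minimal sketch of the new fallback, assuming a NumPy object array of bytes stands in for the h5py string dataset: np.ndindex walks every index of the source shape, the inner loop drills down one axis at a time, and bytes are decoded to str so the values stay serializable.

```python
import numpy as np

# Hypothetical stand-in for an h5py dataset holding byte strings.
data = np.array([[b"ab", b"cd"], [b"ef", b"gh"]], dtype=object)
dset = np.empty(data.shape, dtype=object)

for c in np.ndindex(data.shape):
    o = data
    for i in c:  # index one axis at a time, mirroring __list_fill__
        o = o[i]
    # bytes are not JSON serializable, so decode them to str
    dset[c] = o if not isinstance(o, (bytes, np.bytes_)) else o.decode("utf-8")

print(dset)  # every element is now a Python str
```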
2 changes: 1 addition & 1 deletion tests/unit/test_io_convert.py
@@ -949,7 +949,7 @@ def __get_data_array(self, foo_container):

def test_maxshape(self):
"""test when maxshape is set for the dataset"""
data = H5DataIO(data=list(range(5)), maxshape=(None,))
data = H5DataIO(data=list(range(5)), maxshape=(5,))
self.__roundtrip_data(data=data)
self.assertContainerEqual(self.out_container, self.read_container, ignore_hdmf_attrs=True)
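
A brief note on the change above: in H5DataIO, maxshape=(None,) requests an unlimited, resizable first dimension, while maxshape=(5,) bounds the dataset at five elements, matching the five values written in this test. A minimal sketch, assuming the usual hdmf import path:

```python
from hdmf.backends.hdf5 import H5DataIO

# maxshape=(5,) fixes the maximum length at 5, matching the data below;
# maxshape=(None,) would instead request an unlimited, resizable dimension.
data = H5DataIO(data=list(range(5)), maxshape=(5,))
```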
