Prepare for version 2.5.0 release (#550)
rly authored Apr 22, 2021
1 parent 4a5e6b8 commit 5d903ab
Showing 13 changed files with 244 additions and 50 deletions.
65 changes: 53 additions & 12 deletions .circleci/config.yml
Original file line number Diff line number Diff line change
@@ -63,7 +63,8 @@ references:
name: Run the tests
command: |
. ../venv/bin/activate
if [[ "${TEST_TOX_ENV}" == "py39" ]]; then
# Installing HDF5 is required for building h5py 2.10 wheels for Python 3.9
if [[ "${TEST_TOX_ENV}" == *"py39"* ]]; then
sudo apt-get install libhdf5-dev
fi
pip install tox
@@ -134,6 +135,10 @@ references:
name: Run the gallery tests
command: |
. ../venv/bin/activate
# Installing HDF5 is required for building h5py 2.10 wheels for Python 3.9
if [[ "${TEST_TOX_ENV}" == *"py39"* ]]; then
sudo apt-get install libhdf5-dev
fi
pip install tox
tox -e $TEST_TOX_ENV
no_output_timeout: 30m
@@ -207,6 +212,22 @@ jobs:
- TEST_WHEELINSTALL_ENV: "wheelinstall"
<<: *ci-steps

python39-upgrade-dev:
<<: *py39
environment:
- TEST_TOX_ENV: "py39-upgrade-dev"
- BUILD_TOX_ENV: "build-py39-upgrade-dev"
- TEST_WHEELINSTALL_ENV: "wheelinstall"
<<: *ci-steps

python39-upgrade-dev-pre:
<<: *py39
environment:
- TEST_TOX_ENV: "py39-upgrade-dev-pre"
- BUILD_TOX_ENV: "build-py39-upgrade-dev-pre"
- TEST_WHEELINSTALL_ENV: "wheelinstall"
<<: *ci-steps

python36-min-req:
<<: *py36
environment:
@@ -287,6 +308,18 @@ jobs:
- TEST_TOX_ENV: "gallery-py38-upgrade-dev-pre"
<<: *gallery-steps

gallery39-upgrade-dev:
<<: *py39
environment:
- TEST_TOX_ENV: "gallery-py39-upgrade-dev"
<<: *gallery-steps

gallery39-upgrade-dev-pre:
<<: *py39
environment:
- TEST_TOX_ENV: "gallery-py39-upgrade-dev-pre"
<<: *gallery-steps

gallery36-min-req:
<<: *py36
environment:
@@ -376,18 +409,18 @@ workflows:
jobs:
- flake8:
<<: *no_filters
- python38:
<<: *no_filters
- python36-min-req:
<<: *no_filters
- python38:
<<: *no_filters
- miniconda36:
<<: *no_filters
- miniconda38:
<<: *no_filters
- gallery38:
<<: *no_filters
- gallery36-min-req:
<<: *no_filters
- gallery38:
<<: *no_filters
- deploy-dev:
requires:
- flake8
@@ -442,15 +475,21 @@ workflows:
<<: *no_filters
- python36:
<<: *no_filters
- python36-min-req:
<<: *no_filters
- python37:
<<: *no_filters
- python38:
<<: *no_filters
- python38-upgrade-dev:
<<: *no_filters
- python38-upgrade-dev-pre:
<<: *no_filters
- python39:
<<: *no_filters
- python38-upgrade-dev:
- python39-upgrade-dev:
<<: *no_filters
- python36-min-req:
- python39-upgrade-dev-pre:
<<: *no_filters
- miniconda36:
<<: *no_filters
@@ -462,17 +501,19 @@ workflows:
<<: *no_filters
- gallery36:
<<: *no_filters
- gallery36-min-req:
<<: *no_filters
- gallery37:
<<: *no_filters
- gallery38:
<<: *no_filters
- gallery39:
<<: *no_filters
- gallery38-upgrade-dev:
<<: *no_filters
- gallery36-min-req:
- gallery38-upgrade-dev-pre:
<<: *no_filters
- python38-upgrade-dev-pre:
- gallery39:
<<: *no_filters
- gallery38-upgrade-dev-pre:
- gallery39-upgrade-dev:
<<: *no_filters
- gallery39-upgrade-dev-pre:
<<: *no_filters
19 changes: 15 additions & 4 deletions CHANGELOG.md
@@ -1,16 +1,25 @@
# HDMF Changelog

## HDMF 2.5.0 (Upcoming)
## HDMF 2.5.0 (April 5, 2021)

### New features
- `DynamicTable` can be automatically generated using `get_class`. Now the HDMF API can read files with extensions
that contain a `DynamicTable` without needing to import the extension first. @rly and @bendichter (#536)
- Add `HDF5IO.get_namespaces(path=path, file=file)` method which returns a dict of namespace name mapped to the
namespace version (the largest one if there are multiple) for each namespace cached in the given HDF5 file.
@rly (#527)
- Add experimental namespace to HDMF common schema. New data types should go in the experimental namespace
(hdmf-experimental) prior to being added to the core (hdmf-common) namespace. The purpose of this is to provide
a place to test new data types that may break backward compatibility as they are refined. @ajtritt (#545)
- Use HDMF common schema 1.5.0.
- Add experimental namespace to HDMF common schema. New data types should go in the experimental namespace
(hdmf-experimental) prior to being added to the core (hdmf-common) namespace. The purpose of this is to provide
a place to test new data types that may break backward compatibility as they are refined. @ajtritt (#545)
- `ExternalResources` was changed to support storing both names and URIs for resources. @mavaylon (#517, #548)
- The `VocabData` data type was replaced by `EnumData` to provide more flexible support for data from a set of
fixed values.
- Added `AlignedDynamicTable`, which defines a `DynamicTable` that supports storing a collection of sub-tables.
Each sub-table is itself a `DynamicTable` that is aligned with the main table by row index. Each sub-table
defines a sub-category in the main table effectively creating a table with sub-headings to organize columns.
- See https://hdmf-common-schema.readthedocs.io/en/latest/format_release_notes.html#april-19-2021 for more
details.
- Add `EnumData` type for storing data that comes from a fixed set of values. This replaces `VocabData` i.e.
`VocabData` has been removed. `VocabData` stored vocabulary elements in an attribute, which has a size limit.
`EnumData` now stores elements in a separate dataset, referenced by an attribute stored on the `EnumData` dataset.
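The `HDF5IO.get_namespaces` entry above specifies that when a namespace is cached more than once, the largest version wins. That selection rule can be sketched in plain Python (the `pick_latest` helper below is hypothetical, for illustration only, and is not part of the HDMF API):

```python
def pick_latest(cached_versions):
    """Given {namespace: [version strings]}, keep the largest version per
    namespace, comparing versions numerically rather than lexically."""
    def version_key(version):
        # "1.10.0" -> (1, 10, 0), so 1.10.0 sorts above 1.5.0
        return tuple(int(part) for part in version.split("."))
    return {ns: max(versions, key=version_key)
            for ns, versions in cached_versions.items()}

latest = pick_latest({"hdmf-common": ["1.4.0", "1.5.0"],
                      "hdmf-experimental": ["0.1.0"]})
# latest == {"hdmf-common": "1.5.0", "hdmf-experimental": "0.1.0"}
```

Numeric comparison matters here: a lexical `max` would rank "1.5.0" above "1.10.0".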
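The `EnumData` design described above, values from a fixed set stored as integer codes that index a separate dataset of elements, can be modeled in a few lines of plain Python (the `EnumSketch` class is an illustrative stand-in, not HDMF's implementation):

```python
class EnumSketch:
    """Toy model of the EnumData layout: `elements` holds each unique value
    exactly once; `data` holds integer indices into `elements`."""

    def __init__(self, elements, data):
        self.elements = list(elements)
        self.data = list(data)

    def decode(self):
        # Map each stored index back to its element.
        return [self.elements[i] for i in self.data]

cell_types = EnumSketch(elements=["pyramidal", "granule"], data=[0, 1, 1, 0])
# cell_types.decode() == ["pyramidal", "granule", "granule", "pyramidal"]
```

Because the elements live in their own dataset rather than an attribute, this layout avoids the attribute size limit that constrained `VocabData`.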
@@ -24,6 +33,7 @@

### Internal improvements
- Update CI and copyright year. @rly (#523, #524)
- Refactor class generation code. @rly (#533, #535)
- Equality check for `DynamicTable` returns False if the other object is not a `DynamicTable` instead of raising an error.
@rly (#566)
- Update ruamel.yaml usage to new API. @rly (#587)
@@ -33,6 +43,7 @@
- Fix CI testing on Python 3.9. @rly (#523)
- Fix certain edge cases where `GroupValidator` would not validate all of the child groups or datasets
attached to a `GroupBuilder`. @dsleiter (#526)
- Fix bug for generating classes from link specs and ignored 'help' fields. @rly (#535)
- Various fixes for dynamic class generation. @rly (#561)
- Fix generation of classes that extends both `MultiContainerInterface` and another class that extends
`MultiContainerInterface`. @rly (#567)
32 changes: 28 additions & 4 deletions azure-pipelines-nightly.yml
@@ -19,15 +19,27 @@ jobs:
imageName: 'macos-10.15'
pythonVersion: '3.9'
testToxEnv: 'py39'
coverageToxEnv: ''
buildToxEnv: 'build-py39'
testWheelInstallEnv: 'wheelinstall'

macOS-py3.9-upgrade-dev-pre:
imageName: 'macos-10.15'
pythonVersion: '3.9'
testToxEnv: 'py39-upgrade-dev-pre'
buildToxEnv: 'build-py39-upgrade-dev-pre'
testWheelInstallEnv: 'wheelinstall'

macOS-py3.9-upgrade-dev:
imageName: 'macos-10.15'
pythonVersion: '3.9'
testToxEnv: 'py39'
buildToxEnv: 'build-py39'
testWheelInstallEnv: 'wheelinstall'

macOS-py3.8-upgrade-dev-pre:
imageName: 'macos-10.15'
pythonVersion: '3.8'
testToxEnv: 'py38-upgrade-dev-pre'
coverageToxEnv: ''
buildToxEnv: 'build-py38-upgrade-dev-pre'
testWheelInstallEnv: 'wheelinstall'

@@ -70,15 +82,27 @@ jobs:
imageName: 'vs2017-win2016'
pythonVersion: '3.9'
testToxEnv: 'py39'
coverageToxEnv: ''
buildToxEnv: 'build-py39'
testWheelInstallEnv: 'wheelinstall'

Windows-py3.9-upgrade-dev-pre:
imageName: 'vs2017-win2016'
pythonVersion: '3.9'
testToxEnv: 'py39-upgrade-dev-pre'
buildToxEnv: 'build-py39-upgrade-dev-pre'
testWheelInstallEnv: 'wheelinstall'

Windows-py3.9-upgrade-dev:
imageName: 'vs2017-win2016'
pythonVersion: '3.9'
testToxEnv: 'py39-upgrade-dev'
buildToxEnv: 'build-py39-upgrade-dev'
testWheelInstallEnv: 'wheelinstall'

Windows-py3.8-upgrade-dev-pre:
imageName: 'vs2017-win2016'
pythonVersion: '3.8'
testToxEnv: 'py38-upgrade-dev-pre'
coverageToxEnv: ''
buildToxEnv: 'build-py38-upgrade-dev-pre'
testWheelInstallEnv: 'wheelinstall'

1 change: 0 additions & 1 deletion azure-pipelines.yml
@@ -40,7 +40,6 @@ jobs:
vmImage: $(imageName)

steps:

- checkout: self
submodules: true

10 changes: 5 additions & 5 deletions requirements-dev.txt
@@ -1,10 +1,10 @@
# pinned dependencies to reproduce an entire development environment to use HDMF, run HDMF tests, check code style,
# compute coverage, and create test environments
codecov==2.1.10
coverage==5.3
flake8==3.8.4
codecov==2.1.11
coverage==5.5
flake8==3.9.1
flake8-debugger==4.0.0
flake8-print==4.0.0
importlib-metadata<2
importlib-metadata==4.0.1
python-dateutil==2.8.1
tox==3.20.1
tox==3.23.0
2 changes: 1 addition & 1 deletion requirements-min.txt
@@ -1,7 +1,7 @@
# package dependencies and their minimum versions for installing HDMF
# the requirements here specify '==' for testing; setup.py replaces '==' with '>='
h5py==2.9,<3 # support for setting attrs to lists of utf-8 added in 2.9
numpy==1.16,<1.19.4
numpy==1.16,<1.21
scipy==1.1,<2
pandas==0.23,<2
ruamel.yaml==0.15,<1
4 changes: 2 additions & 2 deletions requirements.txt
@@ -2,6 +2,6 @@
h5py==2.10.0
numpy==1.19.3
scipy==1.5.4
pandas==1.1.4
ruamel.yaml==0.16.12
pandas==1.1.5
ruamel.yaml==0.17.4
jsonschema==3.2.0
2 changes: 1 addition & 1 deletion setup.cfg
@@ -24,7 +24,7 @@ exclude =
src/hdmf/_due.py
per-file-ignores =
docs/gallery/*:E402,T001
docs/source/tutorials/*:E402
docs/source/tutorials/*:E402,T001
src/hdmf/__init__.py:F401
src/hdmf/backends/__init__.py:F401
src/hdmf/backends/hdf5/__init__.py:F401
69 changes: 69 additions & 0 deletions tests/unit/common/test_common_io.py
@@ -0,0 +1,69 @@
from h5py import File

from hdmf.backends.hdf5 import HDF5IO
from hdmf.common import Container, get_manager
from hdmf.spec import NamespaceCatalog
from hdmf.testing import TestCase, remove_test_file

from tests.unit.utils import get_temp_filepath


class TestCacheSpec(TestCase):
"""Test caching spec specifically with the namespaces provided by hdmf.common.
See also TestCacheSpec in tests/unit/test_io_hdf5_h5tools.py.
"""

def setUp(self):
self.manager = get_manager()
self.path = get_temp_filepath()
self.container = Container('dummy')

def tearDown(self):
remove_test_file(self.path)

def test_write_no_cache_spec(self):
"""Roundtrip test for not writing spec."""
with HDF5IO(self.path, manager=self.manager, mode="a") as io:
io.write(self.container, cache_spec=False)
with File(self.path, 'r') as f:
self.assertNotIn('specifications', f)

def test_write_cache_spec(self):
"""Roundtrip test for writing spec and reading it back in."""
with HDF5IO(self.path, manager=self.manager, mode="a") as io:
io.write(self.container)
with File(self.path, 'r') as f:
self.assertIn('specifications', f)
self._check_spec()

def test_write_cache_spec_injected(self):
"""Roundtrip test for writing spec and reading it back in when HDF5IO is passed an open h5py.File."""
with File(self.path, 'w') as fil:
with HDF5IO(self.path, manager=self.manager, file=fil, mode='a') as io:
io.write(self.container)
with File(self.path, 'r') as f:
self.assertIn('specifications', f)
self._check_spec()

def _check_spec(self):
ns_catalog = NamespaceCatalog()
HDF5IO.load_namespaces(ns_catalog, self.path)
self.maxDiff = None
for namespace in self.manager.namespace_catalog.namespaces:
with self.subTest(namespace=namespace):
original_ns = self.manager.namespace_catalog.get_namespace(namespace)
cached_ns = ns_catalog.get_namespace(namespace)
ns_fields_to_check = list(original_ns.keys())
ns_fields_to_check.remove('schema') # schema fields will not match, so reset
for ns_field in ns_fields_to_check:
with self.subTest(namespace_field=ns_field):
self.assertEqual(original_ns[ns_field], cached_ns[ns_field])
for dt in original_ns.get_registered_types():
with self.subTest(data_type=dt):
original_spec = original_ns.get_spec(dt)
cached_spec = cached_ns.get_spec(dt)
with self.subTest('Data type spec is read back in'):
self.assertIsNotNone(cached_spec)
with self.subTest('Cached spec matches original spec'):
self.assertDictEqual(original_spec, cached_spec)