Wip/codecs shards #135

Draft: wants to merge 130 commits into base: master

Changes from all commits (130 commits)

Commits
c4f96f2
feat: wip toward codecs
bogovicj May 21, 2024
14d3b69
feat(wip): reading and writing blocks uses codecs
bogovicj May 28, 2024
76e1ec2
Merge remote-tracking branch 'origin/codecs' into dev/shards
bogovicj Jul 17, 2024
adb84bb
feat: add ChecksumCodec
bogovicj Jul 18, 2024
c54bdb9
wip: use CheckedInput/Output streams from java.util.zip
bogovicj Jul 19, 2024
e756f8e
wip: add ChecksumException
bogovicj Jul 19, 2024
5a82159
refactor: codec getId to getName
bogovicj Jul 23, 2024
1cd307c
feat: add BytesCodec
bogovicj Jul 23, 2024
72cd669
feat: add ShardingCodec
bogovicj Jul 23, 2024
6fbb729
feat(wip): add prelim Shard classes
bogovicj Jul 23, 2024
a213190
feat(wip): update core n5 api with sharding
bogovicj Jul 23, 2024
352427e
wip: more shard/codec work
cmhulbert Jul 23, 2024
ce0b25d
feat/wip: add size partial read lockForReading methods
bogovicj Jul 24, 2024
8aabcf4
wip/feat: ShardIndex progress, VirtualShard progress
bogovicj Jul 24, 2024
77c096d
feat/wip: add VirtualShard
bogovicj Jul 24, 2024
6874c39
test: reading a zarr shard demo
bogovicj Jul 24, 2024
be87013
feat: make partial writes possible for key value access
bogovicj Jul 24, 2024
c5ee84d
fix: AbstractShard getBlockSize
bogovicj Jul 24, 2024
0b4d73c
wip: toward block writing through shard
bogovicj Jul 24, 2024
083cb54
fix: partial write defaults with 0
cmhulbert Aug 1, 2024
bdadb58
wip: zone serialization and sharding?
cmhulbert Aug 2, 2024
1990424
wip/feat: add VirtualShard.createIndex
bogovicj Aug 2, 2024
1931530
fix: return KVA return for types for ranged lockForReading/Writing
bogovicj Aug 2, 2024
786ec1f
feat: add getAttributesKey
bogovicj Aug 5, 2024
bf4f63e
wip: move getAttributesKey to GsonN5Reader
bogovicj Aug 5, 2024
b01b6be
wip add constant N5_DATASET_ATTRIBUTES
bogovicj Aug 6, 2024
0cb5a0f
refactor: Compression interface extends Codec
bogovicj Aug 8, 2024
553c85e
wip: BytesCodec update
bogovicj Aug 9, 2024
35a4d35
test: remove outdated wip config parsing
cmhulbert Aug 9, 2024
d636b83
feat: annotations for extensible serialization for codecs
cmhulbert Aug 12, 2024
1d9c0c2
refactor: rename former 'chunkGrid' variables
bogovicj Aug 12, 2024
c7fb316
test: start CodecSerialization test
bogovicj Aug 12, 2024
344aa46
test: codec array with a compressor
bogovicj Aug 12, 2024
6b1e9a1
test: deserialization behavior
bogovicj Aug 12, 2024
664007b
feat: more shard and codec work
cmhulbert Aug 14, 2024
21ab72c
test: minor updates
bogovicj Aug 14, 2024
314b199
perf: BlockReader have read call static method
bogovicj Aug 14, 2024
5ad00fd
feat: WIP initial read/write blocks through composed codecs
cmhulbert Aug 16, 2024
f38fef7
fix: isDataset caching
cmhulbert Aug 23, 2024
9155ec5
test: fix FixedScaleOffsetTests
bogovicj Aug 26, 2024
e8cfefd
fix: LockedFileChannel locking
bogovicj Aug 26, 2024
17ef0fe
fix: LockedFileChannel truncation logic
bogovicj Aug 26, 2024
2e18288
fix: NameConfigAdapter avoid NPE
bogovicj Aug 26, 2024
7180290
pref: normalGetDatasetAttributes should call createDatasetAttributes
bogovicj Aug 26, 2024
764d05a
feat: DataBlock methods to read/write directly from DataInput/Output
bogovicj Sep 3, 2024
dbd7b15
refactor: create N5BytesCodec, BytesCodec is simple zarr approach
bogovicj Sep 4, 2024
786e4d5
chore(pom): depend on guava
bogovicj Sep 4, 2024
2a7327a
wip: undo ArrayToBytes codec changes
bogovicj Sep 4, 2024
c24605d
fix(wip): DefaultBlockReader/Writer can get ArrayToBytesCodec directly
bogovicj Sep 4, 2024
5f8fa62
test/fix: BytesTests now works with refactor
bogovicj Sep 4, 2024
784a4a2
wip: toward supporting endianness
bogovicj Sep 4, 2024
87456a4
wip: DatasetAttributes allow empty codecs
bogovicj Sep 4, 2024
7ef4297
style: import order
bogovicj Sep 4, 2024
b80fef0
fix: BytesTest N5BytesCodec has name "n5bytes"
bogovicj Sep 4, 2024
ae1f347
feat: wip shard/codec support
cmhulbert Sep 16, 2024
7a1bbcb
feat: rethrow NoSuchFile as NoSuchKey
cmhulbert Sep 17, 2024
9fac328
feat: wip support for ShardIndex location and bytesorder
cmhulbert Sep 17, 2024
48695aa
Merge pull request #128 from cmhulbert/wip/codecsShards
bogovicj Sep 17, 2024
f667bb5
fix: ShardingCodec indexLocation should default to END
bogovicj Sep 19, 2024
6635107
style: ShardingCodec
bogovicj Sep 19, 2024
4489116
fix: getBlockPositionInShard
bogovicj Sep 20, 2024
eb6de85
fix/wip: sharding codec block sizes needs reversing in zarr
bogovicj Sep 20, 2024
a8df678
feat: add getShardAttributes method to DatasetAttributes
bogovicj Sep 20, 2024
865c861
wip: n5 exception and InMemoryShard
bogovicj Nov 19, 2024
fccdb9a
wip: dummy impl of writeShard
bogovicj Nov 19, 2024
19b8618
doc: getShardSize
bogovicj Dec 16, 2024
3229fdd
fix: ShardedDatasetAttributes.getBlockPosition
bogovicj Dec 16, 2024
b270ece
perf: override writeData
bogovicj Dec 18, 2024
f46aa52
chore: bump pom-scijva to 40.0.0
bogovicj Dec 18, 2024
362c74d
perf: initialize cache only if using it
bogovicj Dec 20, 2024
4f6b49b
Merge branch 'wip/codecsShards' of github.com:saalfeldlab/n5 into wip…
bogovicj Dec 20, 2024
edbdef6
fix: make removeAttribute methods' behavior more consistent
bogovicj Dec 23, 2024
1b672de
fix: gzip make uzeZlib parameter optional
bogovicj Dec 23, 2024
bec2996
fix: BlockWriter should not close stream
bogovicj Dec 23, 2024
2c753a1
Revert "fix: BlockWriter should not close stream"
cmhulbert Jan 2, 2025
275aaa6
feat(test): parameterize ShardDemo read/write test; add new test
cmhulbert Jan 2, 2025
9183d11
feat(wip): toward an implementation of writeShard
bogovicj Jan 2, 2025
d4142c5
chore: stop using deprecated BoundedInputStream constructor
bogovicj Jan 2, 2025
bc5d103
fix: be quiet
bogovicj Jan 2, 2025
f294898
Merge branch 'master' into wip/codecsShards
cmhulbert Jan 2, 2025
1c9e601
fix(test): failing on github actions
cmhulbert Jan 2, 2025
b62dc00
fix/test: add writeShardTest
bogovicj Jan 2, 2025
1ced570
feat: writeShardEndStream
cmhulbert Jan 2, 2025
0e045cd
feat: serialize shardSize in DatasetAttributes for n5
bogovicj Jan 3, 2025
f98b11a
test: BytesTest operates on n5 container
bogovicj Jan 3, 2025
cc5b8c5
feat: ShardedDatasetAttributes validate shard/block size on construction
bogovicj Jan 3, 2025
367b987
refactor: remove `DatasetAttributes#getShardedAttributes()`
cmhulbert Jan 3, 2025
20a9677
feat: writeBlocks aggregate shard
cmhulbert Jan 3, 2025
c3c3ceb
fix: index offset calculation
cmhulbert Jan 3, 2025
5badbb7
feat(test): wip shard writeBlocks
cmhulbert Jan 3, 2025
eb9fbc1
feat/refactor: add BlockParameters and ShardParameters interfaces
bogovicj Jan 6, 2025
b0092d3
fix: null compression should result in empty byteCodecs list
bogovicj Jan 7, 2025
458295c
refactor: createIndex now a default method in ShardParameters
bogovicj Jan 7, 2025
da9b9ed
feat: add getByteOrder method for ArrayCodecs
bogovicj Jan 7, 2025
52f762e
feat: writeBlocks respects existing blocks in a given shard if not ov…
cmhulbert Jan 8, 2025
d4dcbe8
refactor: remove generic from ShardParameter
cmhulbert Jan 8, 2025
b2b3d2b
refactor: some signatures
cmhulbert Jan 8, 2025
52751f5
fix: Shard as Iterator<DataBlock<>>
cmhulbert Jan 8, 2025
1161859
feat: merge wip ShardParameters interface
cmhulbert Jan 8, 2025
6e3cbe5
test: ShardIndexTest
bogovicj Jan 8, 2025
0647755
feat: toward direct reading of InMemoryShard
bogovicj Jan 8, 2025
b17cb1f
feat/test: add block position iterator for shard
bogovicj Jan 9, 2025
334e46c
feat: ShardIndex get properties by block index
bogovicj Jan 9, 2025
8449382
feat: add Shard.getNumBlocks
bogovicj Jan 9, 2025
b6a5b4f
perf: VirtualShard smart override of getBlocks
bogovicj Jan 9, 2025
55682ee
fix: GridIterator iteration order
bogovicj Jan 9, 2025
707addf
feat: DataBlockIterator skips missing blocks in Shard
cmhulbert Jan 10, 2025
9a59de2
chore: rm unused ShardReader/Writer classes
bogovicj Jan 10, 2025
edb61a5
fix/test: clone gridPosition
bogovicj Jan 10, 2025
539959f
refactor: EMPTY_INDEX_NBYTES to ShardIndex
bogovicj Jan 10, 2025
0affc94
refactor: EMPTY_INDEX_NBYTES to ShardIndex
bogovicj Jan 10, 2025
dd962e0
Merge branch 'wip/codecsShards' of github.com:saalfeldlab/n5 into wip…
bogovicj Jan 10, 2025
7dde5bb
chore: rm unused method
bogovicj Jan 10, 2025
a791244
feat: Codec add composition helpers
bogovicj Jan 13, 2025
cd54053
fix: EMPTY_INDEX_NBYTES now in ShardIndex
bogovicj Jan 13, 2025
1c92d14
feat: add getBlocks(int[] blockIndexes)
bogovicj Jan 13, 2025
52aaa2a
feat: InMemoryShard add new write methods
bogovicj Jan 13, 2025
6fac5f2
wip: minor change to writeBlocks, implement readBlocks
bogovicj Jan 13, 2025
aca03d4
demo: BlockIterators
bogovicj Jan 14, 2025
367cadb
fix: ShardIndex.getOffsetIndex
bogovicj Jan 15, 2025
1a44168
feat: add Position
bogovicj Jan 17, 2025
43bc1e0
feat: add positionToIndex static methods in GridIterator
bogovicj Jan 17, 2025
0e353e4
feat: ShardParameters methods
bogovicj Jan 17, 2025
7097308
wip: rm unused flatIndex in Shard
bogovicj Jan 17, 2025
f67943d
refactor: InMemoryShard, read/writeBlocks
bogovicj Jan 17, 2025
9224215
wip/feat: N5Reader.readShard
bogovicj Jan 21, 2025
a74d343
feat: add ShardIndex.isEmpty
bogovicj Jan 21, 2025
ce50fa9
test: add a test for nested sharding codecs
bogovicj Jan 24, 2025
3bf8fe8
refactor: large codecs/shards implementation refactor
cmhulbert Feb 4, 2025
0d1eb52
refactor: more from refactorShard branch
cmhulbert Feb 4, 2025
5 changes: 3 additions & 2 deletions README.md
@@ -22,7 +22,7 @@ N5 group is not a single file but simply a directory on the file system. Meta-d

1. All directories of the file system are N5 groups.
2. A JSON file `attributes.json` in a directory contains arbitrary attributes. A group without attributes may not have an `attributes.json` file.
3. The version of this specification is 4.0.0 and is stored in the "n5" attribute of the root group "/".
3. The version of this specification is 1.0.0 and is stored in the "n5" attribute of the root group "/".
4. A dataset is a group with the mandatory attributes:
* dimensions (e.g. [100, 200, 300]),
* blockSize (e.g. [64, 64, 64]),
@@ -38,7 +38,7 @@ N5 group is not a single file but simply a directory on the file system. Meta-d
* xz with parameters
* preset (integer, default 6).

Custom compression schemes with arbitrary parameters can be added using [compression annotations](#extensible-compression-schemes), e.g. [N5 Blosc](https://github.com/saalfeldlab/n5-blosc) and [N5 ZStandard](https://github.com/JaneliaSciComp/n5-zstandard/).
Custom compression schemes with arbitrary parameters can be added using [compression annotations](#extensible-compression-schemes), e.g. [N5 Blosc](https://github.com/saalfeldlab/n5-blosc).
5. Chunks are stored in a directory hierarchy that enumerates their positive integer position in the chunk grid (e.g. `0/4/1/7` for chunk grid position p=(0, 4, 1, 7)).
6. Datasets are sparse, i.e. there is no guarantee that all chunks of a dataset exist.
7. Chunks cannot be larger than 2GB (2<sup>31</sup>Bytes).
@@ -134,3 +134,4 @@ Custom compression schemes can be implemented using the annotation discovery mec
HDF5 is a great format that provides a wealth of conveniences that I do not want to miss. Its inefficiency for parallel writing, however, limits its applicability for handling very large n-dimensional data.

N5 uses the native filesystem of the target platform and JSON files to specify basic and custom meta-data as attributes. It aims at preserving the convenience of HDF5 where possible but doesn't try too hard to be a full replacement.
Please do not take this project too seriously, we will see where it will get us and report back when more data is available.
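
For orientation, a dataset's `attributes.json` following the rules above might look like this (illustrative values only; the `dataType` and `compression` entries follow the usual N5 dataset-attribute conventions and are not part of this diff):

{
  "dimensions": [100, 200, 300],
  "blockSize": [64, 64, 64],
  "dataType": "uint8",
  "compression": {
    "type": "gzip",
    "level": 5
  }
}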
33 changes: 24 additions & 9 deletions pom.xml
@@ -5,7 +5,7 @@
<parent>
<groupId>org.scijava</groupId>
<artifactId>pom-scijava</artifactId>
<version>38.0.1</version>
<version>40.0.0</version>
<relativePath />
</parent>

@@ -161,13 +161,36 @@
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
</dependency>
<dependency>
<groupId>org.scijava</groupId>
<artifactId>scijava-common</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>

<!-- Test dependencies -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.janelia.saalfeldlab</groupId>
<artifactId>n5-universe</artifactId>
<exclusions>
<exclusion>
<groupId>org.janelia.saalfeldlab</groupId>
<artifactId>n5</artifactId>
</exclusion>
</exclusions>
<scope>test</scope>
</dependency>
<dependency>
<groupId>net.imagej</groupId>
<artifactId>ij</artifactId>
@@ -194,14 +217,6 @@
<version>${commons-collections4.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scijava</groupId>
<artifactId>scijava-common</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
</dependency>
</dependencies>

<repositories>
21 changes: 21 additions & 0 deletions src/main/java/org/janelia/saalfeldlab/n5/AbstractDataBlock.java
@@ -25,6 +25,11 @@
*/
package org.janelia.saalfeldlab.n5;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.nio.ByteBuffer;

/**
* Abstract base class for {@link DataBlock} implementations.
*
@@ -63,4 +68,20 @@ public T getData() {

return data;
}

@Override
public void readData(final DataInput input) throws IOException {

final ByteBuffer buffer = toByteBuffer();
input.readFully(buffer.array());
readData(buffer);
}

@Override
public void writeData(final DataOutput output) throws IOException {

final ByteBuffer buffer = toByteBuffer();
output.write(buffer.array());
}

}
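
A small usage sketch of the new stream-based methods (hypothetical example; it assumes the usual ByteArrayDataBlock(int[] size, long[] gridPosition, byte[] data) constructor and a made-up file path):

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.janelia.saalfeldlab.n5.ByteArrayDataBlock;

public class DataBlockStreamDemo {

    public static void main(final String[] args) throws IOException {

        final int[] blockSize = {64, 64, 64};
        final long[] gridPosition = {0, 0, 0};
        final ByteArrayDataBlock block = new ByteArrayDataBlock(
                blockSize, gridPosition, new byte[64 * 64 * 64]);

        final Path path = Paths.get("block.raw"); // hypothetical output file

        // write the block payload straight to a DataOutput
        try (DataOutputStream out = new DataOutputStream(Files.newOutputStream(path))) {
            block.writeData(out);
        }

        // read it back into the same-sized block via the new DataInput overload
        try (DataInputStream in = new DataInputStream(Files.newInputStream(path))) {
            block.readData(in);
        }
    }
}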
11 changes: 11 additions & 0 deletions src/main/java/org/janelia/saalfeldlab/n5/BlockParameters.java
@@ -0,0 +1,11 @@
package org.janelia.saalfeldlab.n5;

public interface BlockParameters {

public long[] getDimensions();

public int getNumDimensions();

public int[] getBlockSize();

}
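
A trivial implementation sketch, only to illustrate what the new interface asks for (hypothetical class, not part of the PR):

import org.janelia.saalfeldlab.n5.BlockParameters;

class SimpleBlockParameters implements BlockParameters {

    private final long[] dimensions;
    private final int[] blockSize;

    SimpleBlockParameters(final long[] dimensions, final int[] blockSize) {

        this.dimensions = dimensions;
        this.blockSize = blockSize;
    }

    @Override
    public long[] getDimensions() {

        return dimensions;
    }

    @Override
    public int getNumDimensions() {

        return dimensions.length;
    }

    @Override
    public int[] getBlockSize() {

        return blockSize;
    }
}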
src/main/java/org/janelia/saalfeldlab/n5/ByteArrayDataBlock.java
@@ -25,6 +25,8 @@
*/
package org.janelia.saalfeldlab.n5;

import java.io.DataInput;
import java.io.IOException;
import java.nio.ByteBuffer;

public class ByteArrayDataBlock extends AbstractDataBlock<byte[]> {
@@ -47,6 +49,12 @@ public void readData(final ByteBuffer buffer) {
buffer.get(getData());
}

@Override
public void readData(final DataInput inputStream) throws IOException {

inputStream.readFully(data);
}

@Override
public int getNumElements() {

18 changes: 16 additions & 2 deletions src/main/java/org/janelia/saalfeldlab/n5/Bzip2Compression.java
@@ -32,8 +32,10 @@
import org.apache.commons.compress.compressors.bzip2.BZip2CompressorInputStream;
import org.apache.commons.compress.compressors.bzip2.BZip2CompressorOutputStream;
import org.janelia.saalfeldlab.n5.Compression.CompressionType;
import org.janelia.saalfeldlab.n5.serialization.NameConfig;

@CompressionType("bzip2")
@NameConfig.Name("bzip2")
public class Bzip2Compression implements DefaultBlockReader, DefaultBlockWriter, Compression {

private static final long serialVersionUID = -4873117458390529118L;
@@ -52,17 +54,29 @@ public Bzip2Compression() {
}

@Override
public InputStream getInputStream(final InputStream in) throws IOException {
public InputStream decode(final InputStream in) throws IOException {

return new BZip2CompressorInputStream(in);
}

@Override
public OutputStream getOutputStream(final OutputStream out) throws IOException {
public InputStream getInputStream(final InputStream in) throws IOException {

return decode(in);
}

@Override
public OutputStream encode(final OutputStream out) throws IOException {

return new BZip2CompressorOutputStream(out, blockSize);
}

@Override
public OutputStream getOutputStream(final OutputStream out) throws IOException {

return encode(out);
}

@Override
public Bzip2Compression getReader() {

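A round-trip sketch using the renamed methods (hypothetical example; note that the getInputStream/getOutputStream overloads above keep existing callers working by delegating to decode/encode):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.janelia.saalfeldlab.n5.Bzip2Compression;

public class Bzip2CodecDemo {

    public static void main(final String[] args) throws IOException {

        final Bzip2Compression bzip2 = new Bzip2Compression();
        final byte[] original = "hello n5 codecs".getBytes(StandardCharsets.UTF_8);

        // encode (formerly getOutputStream)
        final ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (OutputStream enc = bzip2.encode(compressed)) {
            enc.write(original);
        }

        // decode (formerly getInputStream)
        final byte[] roundTrip = new byte[original.length];
        try (InputStream dec = bzip2.decode(new ByteArrayInputStream(compressed.toByteArray()))) {
            int off = 0, n;
            while (off < roundTrip.length && (n = dec.read(roundTrip, off, roundTrip.length - off)) > 0)
                off += n;
        }
    }
}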
src/main/java/org/janelia/saalfeldlab/n5/CachedGsonKeyValueN5Reader.java
@@ -27,14 +27,14 @@

import java.lang.reflect.Type;

import com.google.gson.JsonSyntaxException;
import org.janelia.saalfeldlab.n5.N5Exception.N5IOException;
import org.janelia.saalfeldlab.n5.cache.N5JsonCache;
import org.janelia.saalfeldlab.n5.cache.N5JsonCacheableContainer;

import com.google.gson.Gson;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonSyntaxException;

/**
* {@link N5Reader} implementation through {@link KeyValueAccess} with JSON
@@ -70,7 +70,7 @@ default DatasetAttributes getDatasetAttributes(final String pathName) {
return null;

if (cacheMeta()) {
attributes = getCache().getAttributes(normalPath, N5KeyValueReader.ATTRIBUTES_JSON);
attributes = getCache().getAttributes(normalPath, getAttributesKey());
} else {
attributes = GsonKeyValueN5Reader.super.getAttributes(normalPath);
}
@@ -96,7 +96,7 @@ default <T> T getAttribute(

final JsonElement attributes;
if (cacheMeta()) {
attributes = getCache().getAttributes(normalPathName, N5KeyValueReader.ATTRIBUTES_JSON);
attributes = getCache().getAttributes(normalPathName, getAttributesKey());
} else {
attributes = GsonKeyValueN5Reader.super.getAttributes(normalPathName);
}
@@ -117,7 +117,7 @@ default <T> T getAttribute(
final String normalizedAttributePath = N5URI.normalizeAttributePath(key);
JsonElement attributes;
if (cacheMeta()) {
attributes = getCache().getAttributes(normalPathName, N5KeyValueReader.ATTRIBUTES_JSON);
attributes = getCache().getAttributes(normalPathName, getAttributesKey());
} else {
attributes = GsonKeyValueN5Reader.super.getAttributes(normalPathName);
}
@@ -133,7 +133,7 @@ default boolean exists(final String pathName) {

final String normalPathName = N5URI.normalizeGroupPath(pathName);
if (cacheMeta())
return getCache().isGroup(normalPathName, N5KeyValueReader.ATTRIBUTES_JSON);
return getCache().isGroup(normalPathName, getAttributesKey());
else {
return existsFromContainer(normalPathName, null);
}
@@ -176,7 +176,7 @@ default boolean datasetExists(final String pathName) throws N5IOException {

final String normalPathName = N5URI.normalizeGroupPath(pathName);
if (cacheMeta()) {
return getCache().isDataset(normalPathName, N5KeyValueReader.ATTRIBUTES_JSON);
return getCache().isDataset(normalPathName, getAttributesKey());
}
return isDatasetFromContainer(normalPathName);
}
@@ -208,7 +208,7 @@ default JsonElement getAttributes(final String pathName) throws N5IOException {

/* If cached, return the cache */
if (cacheMeta()) {
return getCache().getAttributes(groupPath, N5KeyValueReader.ATTRIBUTES_JSON);
return getCache().getAttributes(groupPath, getAttributesKey());
} else {
return GsonKeyValueN5Reader.super.getAttributes(groupPath);
}
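
The substitution above means the attributes key is no longer hard-coded to attributes.json; a minimal sketch of the idea (an assumption based on this diff, not the PR's actual getAttributesKey implementation):

import org.janelia.saalfeldlab.n5.N5KeyValueReader;

// hypothetical sketch of an overridable attributes key
interface HasAttributesKey {

    default String getAttributesKey() {

        // "attributes.json" for plain N5 containers; other backends can override
        return N5KeyValueReader.ATTRIBUTES_JSON;
    }
}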
src/main/java/org/janelia/saalfeldlab/n5/CachedGsonKeyValueN5Writer.java
@@ -59,9 +59,9 @@ default void createGroup(final String path) throws N5Exception {
// else if exists is true (then a dataset is present) so throw an exception to avoid
// overwriting / invalidating existing data
if (cacheMeta()) {
if (getCache().isGroup(normalPath, N5KeyValueReader.ATTRIBUTES_JSON))
if (getCache().isGroup(normalPath, getAttributesKey()))
return;
else if (getCache().exists(normalPath, N5KeyValueReader.ATTRIBUTES_JSON)) {
else if (getCache().exists(normalPath, getAttributesKey())) {
throw new N5Exception("Can't make a group on existing path.");
}
}
@@ -88,8 +88,8 @@ else if (getCache().exists(normalPath, N5KeyValueReader.ATTRIBUTES_JSON)) {
for (final String child : pathParts) {

final String childPath = parent.isEmpty() ? child : parent + "/" + child;
getCache().initializeNonemptyCache(childPath, N5KeyValueReader.ATTRIBUTES_JSON);
getCache().updateCacheInfo(childPath, N5KeyValueReader.ATTRIBUTES_JSON);
getCache().initializeNonemptyCache(childPath, getAttributesKey());
getCache().updateCacheInfo(childPath, getAttributesKey());

// only add if the parent exists and has children cached already
if (parent != null && !child.isEmpty())
@@ -130,7 +130,7 @@ default void writeAndCacheAttributes(
nullRespectingAttributes = getGson().toJsonTree(attributes);
}
/* Update the cache, and write to the writer */
getCache().updateCacheInfo(normalGroupPath, N5KeyValueReader.ATTRIBUTES_JSON, nullRespectingAttributes);
getCache().updateCacheInfo(normalGroupPath, getAttributesKey(), nullRespectingAttributes);
}
}

43 changes: 37 additions & 6 deletions src/main/java/org/janelia/saalfeldlab/n5/Compression.java
@@ -25,21 +25,29 @@
*/
package org.janelia.saalfeldlab.n5;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.Serializable;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.janelia.saalfeldlab.n5.codec.Codec;
import org.scijava.annotations.Indexable;

/**
* Deprecated: {@link Compression}s are no longer a special case.
* <br>
* Use {@link Codec.BytesCodec} for implementing compressors
* <p> </p>
* Compression scheme interface.
*
* @author Stephan Saalfeld
*/
public interface Compression extends Serializable {
public interface Compression extends Serializable, Codec.BytesCodec {

/**
* Annotation for runtime discovery of compression schemes.
@@ -49,7 +57,7 @@ public interface Compression extends Serializable {
@Inherited
@Target(ElementType.TYPE)
@Indexable
public static @interface CompressionType {
@interface CompressionType {

String value();
}
@@ -61,9 +69,10 @@ public interface Compression extends Serializable {
@Retention(RetentionPolicy.RUNTIME)
@Inherited
@Target(ElementType.FIELD)
public static @interface CompressionParameter {}
@interface CompressionParameter {}

public default String getType() {
@Override
default String getType() {

final CompressionType compressionType = getClass().getAnnotation(CompressionType.class);
if (compressionType == null)
@@ -72,7 +81,29 @@ public default String getType() {
return compressionType.value();
}

public BlockReader getReader();

public BlockWriter getWriter();
BlockReader getReader();

BlockWriter getWriter();

/**
* Decode an {@link InputStream}.
*
* @param in
* input stream
* @return the decoded input stream
*/
@Override
InputStream decode(InputStream in) throws IOException;

/**
* Encode an {@link OutputStream}.
*
* @param out
* the output stream
* @return the encoded output stream
*/
@Override
OutputStream encode(OutputStream out) throws IOException;

}
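
Because Compression now extends Codec.BytesCodec, an existing compression can be handed to any code that expects a bytes codec; a small sketch using the Bzip2Compression from this diff (assuming Codec.BytesCodec declares the encode/decode stream methods shown above):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import org.janelia.saalfeldlab.n5.Bzip2Compression;
import org.janelia.saalfeldlab.n5.codec.Codec;

public class CompressionAsCodecDemo {

    public static void main(final String[] args) throws IOException {

        // a Compression instance is now also a Codec.BytesCodec
        final Codec.BytesCodec codec = new Bzip2Compression();

        final ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (OutputStream encoded = codec.encode(out)) {
            encoded.write(new byte[] {1, 2, 3});
        }
        // out now holds the bzip2-compressed bytes
    }
}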