diff --git a/.gitignore b/.gitignore
index 796b21a..657d773 100644
--- a/.gitignore
+++ b/.gitignore
@@ -38,3 +38,6 @@ classes/
# MacOS
.DS_Store
+# config files
+vertx-proxy/src/main/resources/config.json
+
diff --git a/README.md b/README.md
index 15ed937..b24e900 100644
--- a/README.md
+++ b/README.md
@@ -6,26 +6,17 @@
The goal of this project is to provide proxy-based, topic-level encryption-at-rest for [Apache Kafka®](https://kafka.apache.org/).
-To learn more about the background and architecture of topic encryption, see our [overview document](doc/README.md).
-The next planned milestones in the project are:
+### Documentation
+To learn more about the background and architecture of topic encryption, see our [overview document](doc/README.md).
-## M1, May 14: Foundation
-- Technical specification of the project
-- Assessment of viable proxy
- - Envoy vs. a custom-developed proxy (in golang or Java)
+The [getting started guide](doc/getting-started.md) explains how to compile and run the encrypting proxy for testing and evaluation.
-## M2, June 04: Alpha proxy
-- Initial implementation of selected proxy architecture
- - stand-alone, not yet integrated
+### Project structure
+The project consists of two nested Maven projects:
+- [encmod](encmod/), the topic encryption module
+- [vertx-proxy](vertx-proxy/), an experimental Kafka proxy for developing and testing the encryption module
-## M3, June 18: Proxy integration evaluation
-- First version of the software encryption module
-- Integration of encryption module with proxy
-- Evaluation of proxy integration into Strimzi and build environment
-## M4, July 02: Alpha Strimzi integration
-- Integrate proxy with the Strimzi project
-- Integrate encryption module
diff --git a/common/pom.xml b/common/pom.xml
new file mode 100644
index 0000000..81795ef
--- /dev/null
+++ b/common/pom.xml
@@ -0,0 +1,6 @@
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <groupId>strimzi.io</groupId>
+    <artifactId>common</artifactId>
+    <version>0.0.1-SNAPSHOT</version>
+</project>
\ No newline at end of file
diff --git a/doc/README.md b/doc/README.md
index c37649e..064b484 100644
--- a/doc/README.md
+++ b/doc/README.md
@@ -4,7 +4,7 @@
# Proxy-Based Topic-level Encryption at Rest for Kafka
-The goal of this project is to provide proxy-based, topic-level encryption-at-rest for [Apache Kafka](https://kafka.apache.org/).
+The goal of this project is to provide proxy-based, topic-level encryption-at-rest for [Apache Kafka](https://kafka.apache.org/). This document provides an overview of the motivation and architecture of the encrypting proxy. For more details, see the [references below](#references) for links to our research paper [1] and project proposal [2].
Although Kafka provides multiple authentication methods and encrypted communication over [TLS](https://en.wikipedia.org/wiki/Transport_Layer_Security), it does not encrypt data at rest.
Yet Kafka is increasingly used as a store of data, not just as a means of transferring data from one location to another.
@@ -28,12 +28,13 @@ The diagram below depicts the main components of the proposal, illustrating clie
One core component, the _Encryption Module_, provides the encryption functionality.
A second core component, the _Proxy_, intercepts Kafka connections and delegates message processing to the Encryption Module.
-Topic can be encrypted by different keys, allowing brokers to store a mix of encrypted and unencrypted data, with data owners managing the keys to their topics.
+Topics can be encrypted by different keys, allowing brokers to store a mix of encrypted and unencrypted data, with data owners managing the keys to their topics.
Keys will be stored in an external key management system with access policies and logging.
-In the coming weeks we will be providing the specification for the core components along with a roadmap.
We look forward to engaging with the Community in developing this exciting extension to Strimzi and Kafka!
-P.S. The original [Strimzi proposal #17](https://github.com/strimzi/proposals/blob/master/017-kafka-topic-encryption.md) provides additional background.
+## References
+1. [Securing Kafka with Encryption-at-Rest](https://ieeexplore.ieee.org/abstract/document/9671388/), Chris Giblin, Sean Rooney, Pascal Vetsch, and Adrian Preston, 2021 IEEE International Conference on Big Data (Big Data)
+2. The original [Strimzi proposal #17](https://github.com/strimzi/proposals/blob/master/017-kafka-topic-encryption.md) provides additional background.
diff --git a/doc/getting-started.md b/doc/getting-started.md
new file mode 100644
index 0000000..c0278f8
--- /dev/null
+++ b/doc/getting-started.md
@@ -0,0 +1,125 @@
+# Getting started
+
+Requirements:
+- a Kafka instance, version 2.8.0 or older, whose configuration you can modify
+- Java 17
+- Apache Maven installed in your command-line environment
+- the git command-line tool
+
+
+The steps for getting started with this initial version of topic encryption are outlined below:
+1. Clone the repository and set your working path
+2. Compile
+3. Configure the Kafka broker's listeners
+4. Configure the proxy
+5. Run the proxy
+6. Start the Kafka broker
+7. Run Kafka clients
+
+Each of these steps is described in detail below with an example.
+
+## Scenario
+
+In this getting-started scenario, all components run on the same system, `localhost`. The Kafka broker could also run remotely; the minimum requirement is that you can update the broker configuration file and restart the broker. In this example, however, we run the broker locally.
+
+The proxy listens on port 1234 and the broker on its standard port 9092, as depicted below:
+
+```
+ Kafka client Proxy Kafka broker
+ o------------o 1234 o------------o 9092
+```
+
+The clients are reconfigured to use port 1234 (details below).
+
+This example uses a policy that encrypts all topics with the same key, along with a test key management system (KMS) that returns a hard-coded AES key.
+
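+For illustration, here is a minimal sketch of what such a test KMS could look like, assuming the `KeyMgtSystem` interface used by the encryption module (the class name `TestKms` and the all-zero key are hypothetical and for evaluation only):
+
+```
+import javax.crypto.SecretKey;
+import javax.crypto.spec.SecretKeySpec;
+
+import io.strimzi.kafka.topicenc.kms.KeyMgtSystem;
+
+public class TestKms implements KeyMgtSystem {
+    @Override
+    public SecretKey getKey(String keyReference) {
+        // WARNING: fixed, all-zero 256-bit AES key - for testing only
+        return new SecretKeySpec(new byte[32], "AES");
+    }
+}
+```
+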
+The following sections provide details for each step in running the encrypting proxy.
+
+### 1. Clone the repository and set your working path
+```
+git clone git@github.com:strimzi/topic-encryption.git
+cd topic-encryption
+```
+
+### 2. Compile
+
+```
+mvn install
+```
+
+### 3. Configure the Kafka broker's listeners
+The address advertised by Kafka must be that of the proxy, not the broker itself.
+
+Modify the `advertised.listeners` property in `$KAFKA_HOME/config/server.properties` to point to the proxy host and port, as shown in the snippet below:
+
+```
+# The address the socket server listens on. It will get the value returned from
+# java.net.InetAddress.getCanonicalHostName() if not configured.
+# FORMAT:
+# listeners = listener_name://host_name:port
+# EXAMPLE:
+# listeners = PLAINTEXT://your.host.name:9092
+listeners=PLAINTEXT://:9092
+
+# Hostname and port the broker will advertise to producers and consumers. If not set,
+# it uses the value for "listeners" if configured. Otherwise, it will use the value
+# returned from java.net.InetAddress.getCanonicalHostName().
+advertised.listeners=PLAINTEXT://127.0.0.1:1234
+```
+
+If the Kafka broker is already running, stop it now; you will start it again once the proxy is running (step 6).
+
+### 4. Configure the proxy
+Set the working directory to the proxy's target folder:
+```
+$ cd vertx-proxy/target/
+```
+
+Create a configuration file named `config.json` with the following JSON contents:
+
+```
+{
+ "listening_port" : 1234,
+ "kafka_broker" : "localhost:9092",
+ "policy_repo" : "test"
+}
+```
+
+### 5. Run the proxy
+With the current path set to the target directory, run the proxy with the following Java invocation:
+
+```
+$ java -cp vertx-proxy-0.0.1-SNAPSHOT-fat.jar io.strimzi.kafka.proxy.vertx.VertRunner
+```
+
+If the proxy starts successfully, output like the following appears:
+```
+$ java -cp vertx-proxy-0.0.1-SNAPSHOT-fat.jar io.strimzi.kafka.proxy.vertx.VertRunner
+WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
+2022-04-13 10:30:12 INFO KafkaProxyVerticle:46 35 - Kafka version: 2.8.0
+2022-04-13 10:30:12 INFO KafkaProxyVerticle:75 35 - Listening on port 1234
+```
+
+### 6. Start the Kafka broker
+
+Now start the Kafka broker, for example:
+```
+$KAFKA_HOME/bin/kafka-server-start.sh config/server.properties
+```
+
+### 7. Run Kafka clients
+Start the Kafka console producer (note the proxy address in the broker list):
+
+```
+$KAFKA_HOME/bin/kafka-console-producer.sh --broker-list localhost:1234 --topic enctest --producer.config config/producer.properties
+```
+
+Start the Kafka console consumer, again specifying the proxy host and port:
+```
+$KAFKA_HOME/bin/kafka-console-consumer.sh --bootstrap-server localhost:1234 --consumer.config config/consumer.properties --topic enctest --from-beginning
+```
+
+Enter arbitrary data in the producer and verify that it appears in the consumer.
+
+Inspect the topic segment files and verify that they are indeed encrypted:
+```
+$KAFKA_HOME/bin/kafka-dump-log.sh --files /tmp/kafka-logs/enctest-0/00000000000000000000.log --value-decoder-class kafka.serializer.StringDecoder
+```
diff --git a/encmod/README.md b/encmod/README.md
new file mode 100644
index 0000000..4377af0
--- /dev/null
+++ b/encmod/README.md
@@ -0,0 +1,5 @@
+# Topic Encryption Module
+
+This component is concerned strictly with the encryption and decryption of Kafka records.
+
+
diff --git a/encmod/pom.xml b/encmod/pom.xml
new file mode 100644
index 0000000..9b6fbba
--- /dev/null
+++ b/encmod/pom.xml
@@ -0,0 +1,41 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <parent>
+        <groupId>io.strimzi</groupId>
+        <artifactId>topic-encryption</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+    <modelVersion>4.0.0</modelVersion>
+    <artifactId>encmod</artifactId>
+    <name>encryption module</name>
+    <description>desc</description>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.kafka</groupId>
+            <artifactId>kafka-clients</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.logging.log4j</groupId>
+            <artifactId>log4j-api</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.logging.log4j</groupId>
+            <artifactId>log4j-core</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.logging.log4j</groupId>
+            <artifactId>log4j-slf4j-impl</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.slf4j</groupId>
+            <artifactId>slf4j-api</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>junit</groupId>
+            <artifactId>junit</artifactId>
+            <scope>test</scope>
+        </dependency>
+    </dependencies>
+</project>
diff --git a/encmod/src/main/java/io/strimzi/kafka/topicenc/EncModControl.java b/encmod/src/main/java/io/strimzi/kafka/topicenc/EncModControl.java
new file mode 100644
index 0000000..f38bada
--- /dev/null
+++ b/encmod/src/main/java/io/strimzi/kafka/topicenc/EncModControl.java
@@ -0,0 +1,28 @@
+/*
+ * Copyright Strimzi authors.
+ * License: Apache License 2.0 (see the file LICENSE or http://apache.org/licenses/LICENSE-2.0.html).
+ */
+package io.strimzi.kafka.topicenc;
+
+/**
+ * This interface defines the functions controlling the Encryption
+ * Module's internal state. For example, an implementation receiving
+ * events from a key management system (KMS) can notify the module
+ * to purge a key because it has expired. If we consider the
+ * Encryption Module's encrypt() and decrypt() functions to comprise
+ * the data path, this interface describes its control path.
+ *
+ * Currently this interface is a placeholder and will be continually
+ * extended as the implementation matures.
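+ *
+ * A usage sketch (hypothetical caller; e.g. a KMS event handler
+ * invalidating a revoked key):
+ * <pre>
+ *   void onKeyRevoked(String keyRef) {
+ *       encModControl.purgeKey(keyRef);
+ *   }
+ * </pre>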
+ */
+public interface EncModControl {
+
+ /**
+     * Purge the key, indicated by the keyRef argument, from any
+     * internal state such that the key in question is no longer used.
+     * This supports key revocation.
+ *
+ * @param keyref A key reference, understood by the Encryption Module and its KMS, identifying the key to purge.
+ */
+ void purgeKey(String keyref);
+}
diff --git a/encmod/src/main/java/io/strimzi/kafka/topicenc/EncryptionModule.java b/encmod/src/main/java/io/strimzi/kafka/topicenc/EncryptionModule.java
new file mode 100644
index 0000000..0302db3
--- /dev/null
+++ b/encmod/src/main/java/io/strimzi/kafka/topicenc/EncryptionModule.java
@@ -0,0 +1,253 @@
+/*
+ * Copyright Strimzi authors.
+ * License: Apache License 2.0 (see the file LICENSE or http://apache.org/licenses/LICENSE-2.0.html).
+ */
+package io.strimzi.kafka.topicenc;
+
+import java.nio.ByteBuffer;
+import java.security.GeneralSecurityException;
+import java.util.HashMap;
+import java.util.Map;
+
+import javax.crypto.SecretKey;
+
+import org.apache.kafka.common.message.FetchResponseData.FetchablePartitionResponse;
+import org.apache.kafka.common.message.FetchResponseData.FetchableTopicResponse;
+import org.apache.kafka.common.message.ProduceRequestData.PartitionProduceData;
+import org.apache.kafka.common.message.ProduceRequestData.TopicProduceData;
+import org.apache.kafka.common.record.CompressionType;
+import org.apache.kafka.common.record.MemoryRecords;
+import org.apache.kafka.common.record.MemoryRecordsBuilder;
+import org.apache.kafka.common.record.RecordBatch;
+import org.apache.kafka.common.record.SimpleRecord;
+import org.apache.kafka.common.record.TimestampType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import io.strimzi.kafka.topicenc.enc.AesGcmEncrypter;
+import io.strimzi.kafka.topicenc.enc.EncData;
+import io.strimzi.kafka.topicenc.enc.EncrypterDecrypter;
+import io.strimzi.kafka.topicenc.kms.KeyMgtSystem;
+import io.strimzi.kafka.topicenc.policy.PolicyRepository;
+import io.strimzi.kafka.topicenc.policy.TopicPolicy;
+import io.strimzi.kafka.topicenc.ser.AesGcmV1SerDer;
+import io.strimzi.kafka.topicenc.ser.EncSerDer;
+import io.strimzi.kafka.topicenc.ser.EncSerDerException;
+
+
+/**
+ * This class is the encompassing, deployable component containing
+ * the Kafka topic encryption implementation.
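+ *
+ * A usage sketch (hypothetical wiring; the PolicyRepository and
+ * KeyMgtSystem instances are supplied by the embedding proxy):
+ * <pre>
+ *   EncryptionModule module = new EncryptionModule(policyRepo, kms);
+ *   module.encrypt(topicProduceData);       // on a ProduceRequest
+ *   module.decrypt(fetchableTopicResponse); // on a FetchResponse
+ * </pre>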
+ */
+public class EncryptionModule implements EncModControl {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(EncryptionModule.class);
+
+ private PolicyRepository policyRepo;
+ private KeyMgtSystem kms;
+    private Map<String, EncrypterDecrypter> keyCache;
+ private EncSerDer encSerDer;
+
+ public EncryptionModule (PolicyRepository policyRepo, KeyMgtSystem kms) {
+ this.policyRepo = policyRepo;
+ this.kms = kms;
+ keyCache = new HashMap<>();
+ encSerDer = new AesGcmV1SerDer();
+ // init kms connection
+ // init policy Repo
+ // init kms cache
+ // create enc/dec
+ // init encdec cache
+ }
+
+ public boolean encrypt(TopicProduceData topicData)
+ throws EncSerDerException, GeneralSecurityException {
+
+ final EncrypterDecrypter encrypter;
+ try {
+ encrypter = getTopicEncrypter(topicData.name());
+ } catch (Exception e1) {
+ LOGGER.error("Error obtaining encrypter for topic " + topicData.name());
+ return false;
+ }
+
+ if (encrypter == null) {
+ LOGGER.debug(
+ "No encryption - topic {} is not configured for encryption",topicData.name());
+ return false;
+ }
+
+        // If we get this far, the data should be encrypted.
+ // Navigate into each record and encrypt.
+ for (PartitionProduceData partitionData : topicData.partitionData()) {
+
+ MemoryRecords recs = (MemoryRecords) partitionData.records();
+ MemoryRecordsBuilder builder = createMemoryRecsBuilder(recs.buffer().capacity());
+ for (org.apache.kafka.common.record.Record record : recs.records()) {
+ if (record.hasValue()) {
+ // encrypt record value:
+ byte[] plaintext = new byte[record.valueSize()];
+ record.value().get(plaintext);
+ EncData ciphertext = encrypter.encrypt(plaintext);
+
+ // serialize the ciphertext and metadata, add to the builder:
+ encSerDer.serialize(builder, record, ciphertext);
+ }
+ }
+ // overwrite the partition's memoryrecords with the encrypted records:
+ partitionData.setRecords(builder.build());
+ }
+ return true;
+ }
+
+ public boolean decrypt(FetchableTopicResponse fetchRsp)
+ throws EncSerDerException, GeneralSecurityException {
+
+ String topicName = fetchRsp.topic();
+ final EncrypterDecrypter encrypter;
+ try {
+ encrypter = getTopicEncrypter(topicName);
+ } catch (Exception e) {
+ LOGGER.error("Error obtaining encrypter for topic " + topicName, e);
+ return false;
+ }
+
+ if (encrypter == null) {
+ LOGGER.debug(
+ "No decryption - topic {} is not configured for encryption", topicName);
+ return false;
+ }
+
+        // If we get this far, the data was encrypted.
+ // Navigate into each record and decrypt.
+ for (FetchablePartitionResponse partitionData : fetchRsp.partitionResponses()) {
+
+ if (LOGGER.isDebugEnabled()) {
+ String msg = String.format(
+ "partition: %d, logStartOffset: %08X, lastStableOffset: %08X, " +
+ "partition leader epoch: %04X",
+                        partitionData.partition(),
+                        partitionData.logStartOffset(),
+                        partitionData.lastStableOffset(),
+                        partitionData.currentLeader().leaderEpoch());
+ LOGGER.debug(msg);
+ }
+
+ MemoryRecords recs = (MemoryRecords) partitionData.recordSet();
+
+ long firstOffset = getFirstOffset(recs);
+ MemoryRecordsBuilder builder =
+ createMemoryRecsBuilder(recs.sizeInBytes(),
+ partitionData.currentLeader().leaderEpoch(),
+ firstOffset);
+ for (org.apache.kafka.common.record.Record record : recs.records()) {
+ if (record.hasValue()) {
+ byte[] ciphertext = new byte[record.valueSize()];
+ record.value().get(ciphertext);
+
+ // serialize value into version, iv, ciphertext:
+ EncData md = encSerDer.deserialize(ciphertext);
+
+ // decrypt, add to records builder:
+ byte[] plaintext = encrypter.decrypt(md);
+
+ SimpleRecord newRec = new SimpleRecord(record.timestamp(),
+ record.key(),
+ ByteBuffer.wrap(plaintext),
+ record.headers());
+ builder.append(newRec);
+ }
+ }
+ // overwrite the partition's memoryrecords with the decrypted records:
+ MemoryRecords newRecs = builder.build();
+ partitionData.setRecordSet(newRecs);
+ }
+ return true;
+ }
+
+ /**
+ * EncMod control interface. Empty, placeholder implementation
+ * for the time being.
+ */
+ @Override
+ public void purgeKey(String keyref) {
+ }
+
+ /**
+     * Consults the policy db to determine whether a topic is to be encrypted.
+     * If the topic is not to be encrypted, returns null.
+ * @throws Exception
+ */
+ protected EncrypterDecrypter getTopicEncrypter (String topicName) throws Exception {
+
+ String topicKey = topicName.toLowerCase();
+
+ // first check cache
+ EncrypterDecrypter enc = keyCache.get(topicKey);
+ if (enc != null) {
+ return enc;
+ }
+
+ // query policy db for a policy for this topic:
+ TopicPolicy policy = policyRepo.getTopicPolicy(topicKey);
+ if (policy == null) {
+ return null; // no encryption policy for this topic. return null
+ }
+
+ // encryption policy exists for this topic. Retrieve key
+ SecretKey key = getKey(policy);
+
+ // instantiate the encrypter/decrypter for this topic
+ // todo: factory for creating type of encrypter - comes from policy
+ enc = new AesGcmEncrypter(key);
+
+ // add to cache and return
+ keyCache.put(topicKey, enc);
+ return enc;
+ }
+
+ private long getFirstOffset(MemoryRecords recs) {
+ for (org.apache.kafka.common.record.Record r : recs.records()) {
+ if (r.hasValue()) {
+ return r.offset();
+ }
+ }
+ return 0;
+ }
+
+ /**
+     * Given an encryption policy, retrieve and return the encryption key.
+ * @param policy
+ * @return
+ * @throws Exception
+ */
+ private SecretKey getKey(TopicPolicy policy) throws Exception {
+ return kms.getKey(policy.getKeyReference());
+ }
+
+ private MemoryRecordsBuilder createMemoryRecsBuilder(int bufSize) {
+ return createMemoryRecsBuilder(bufSize, RecordBatch.NO_PARTITION_LEADER_EPOCH);
+ }
+
+ private MemoryRecordsBuilder createMemoryRecsBuilder(int bufSize, int partitionEpoch) {
+ return createMemoryRecsBuilder(bufSize, partitionEpoch, 0L);
+ }
+
+ private MemoryRecordsBuilder createMemoryRecsBuilder(int bufSize, int partitionEpoch, long baseOffset) {
+        // small initial buffer; MemoryRecordsBuilder grows it as needed
+        ByteBuffer buffer = ByteBuffer.allocate(10);
+        return new MemoryRecordsBuilder(
+                buffer,
+                RecordBatch.CURRENT_MAGIC_VALUE,
+                CompressionType.NONE,
+                TimestampType.CREATE_TIME,
+                baseOffset,
+                RecordBatch.NO_TIMESTAMP,      // logAppendTime
+                RecordBatch.NO_PRODUCER_ID,
+                RecordBatch.NO_PRODUCER_EPOCH,
+                0,                             // baseSequence
+                false,                         // isTransactional
+                false,                         // isControlBatch
+                partitionEpoch,                // partitionLeaderEpoch
+                bufSize);
+ }
+}
diff --git a/encmod/src/main/java/io/strimzi/kafka/topicenc/LogUtils.java b/encmod/src/main/java/io/strimzi/kafka/topicenc/LogUtils.java
new file mode 100644
index 0000000..1f8bc66
--- /dev/null
+++ b/encmod/src/main/java/io/strimzi/kafka/topicenc/LogUtils.java
@@ -0,0 +1,63 @@
+/*
+ * Copyright Strimzi authors.
+ * License: Apache License 2.0 (see the file LICENSE or http://apache.org/licenses/LICENSE-2.0.html).
+ */
+package io.strimzi.kafka.topicenc;
+
+import java.util.Base64;
+
+public class LogUtils {
+
+ public static String base64Encode(byte[] rawBytes) {
+ return Base64.getEncoder().encodeToString(rawBytes);
+ }
+
+ public static byte[] base64Decode(String base64Str) {
+ return Base64.getDecoder().decode(base64Str);
+ }
+
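+    /**
+     * Write a classic 16-bytes-per-line hex dump of the buffer to stdout,
+     * showing printable ASCII characters in a right-hand column. Intended
+     * for debugging; an optional title line is printed first.
+     */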
+ public static void hexDump(String title, byte[] buffer) {
+
+ if (buffer == null) {
+ return;
+ }
+ if (title != null) {
+ title = String.format("%s (buffer.length=%d %04X bytes)", title, buffer.length, buffer.length);
+ System.out.println(title);
+ }
+ final String MID_FILLER = " ";
+ StringBuilder hex = new StringBuilder();
+ StringBuilder chars = new StringBuilder();
+ int numBytes = buffer.length;
+ int i = 0;
+ for (i = 0; i < numBytes; i++) {
+
+ if ((i > 0) && (i % 16 == 0)) {
+ hex.append(MID_FILLER);
+ hex.append(chars);
+ hex.append('\n');
+ chars = new StringBuilder();
+ }
+ byte b = buffer[i];
+ hex.append(String.format("%02X ", b));
+ if (b >= 0x20 && b < 0x7F) {
+ chars.append((char) b);
+ } else {
+ chars.append('.');
+ }
+ }
+
+ // loop over. add remainders
+ if (chars.length() > 0) {
+ for (int j = i % 16; j < 16; j++) {
+ hex.append(" ");
+ }
+ hex.append(MID_FILLER);
+ hex.append(chars);
+ hex.append('\n');
+ }
+ // for now, write to stdout
+ System.out.println(hex);
+ }
+
+}
diff --git a/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/AesGcmEncrypter.java b/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/AesGcmEncrypter.java
new file mode 100644
index 0000000..a38b551
--- /dev/null
+++ b/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/AesGcmEncrypter.java
@@ -0,0 +1,70 @@
+/*
+ * Copyright Strimzi authors.
+ * License: Apache License 2.0 (see the file LICENSE or http://apache.org/licenses/LICENSE-2.0.html).
+ */
+package io.strimzi.kafka.topicenc.enc;
+
+import java.security.GeneralSecurityException;
+
+import javax.crypto.Cipher;
+import javax.crypto.SecretKey;
+import javax.crypto.spec.GCMParameterSpec;
+
+/**
+ * An Encrypter/Decrypter for AES GCM.
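+ *
+ * A usage sketch (assuming a 256-bit key to match the AES_256/GCM/NoPadding
+ * transformation):
+ * <pre>
+ *   SecretKey key = CryptoUtils.generateAesKey(256);
+ *   AesGcmEncrypter enc = new AesGcmEncrypter(key);
+ *   EncData encrypted = enc.encrypt(plaintextBytes);
+ *   byte[] plaintext = enc.decrypt(encrypted);
+ * </pre>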
+ */
+public class AesGcmEncrypter implements EncrypterDecrypter {
+
+ public static final int IV_SIZE = 16; // bytes
+    public static final int KEY_SIZE = 128; // bits; used as the GCM authentication tag length
+ private static final String JCE_PROVIDER = "SunJCE"; // for now
+
+ private final String transformation;
+ private final SecretKey key;
+
+ public AesGcmEncrypter(SecretKey key) {
+ this.key = key;
+ this.transformation = CryptoUtils.AES256_GCM_NOPADDING;
+ }
+
+ @Override
+ public EncData encrypt(byte[] plaintext) throws GeneralSecurityException {
+ byte[] iv = CryptoUtils.createRandom(IV_SIZE);
+ return encrypt(plaintext, iv);
+ }
+
+ @Override
+ public EncData encrypt(byte[] plaintext, byte[] iv) throws GeneralSecurityException {
+ Cipher encCipher = createEncryptionCipher(transformation, key, iv);
+ byte[] ciphertext = encCipher.doFinal(plaintext);
+ return new EncData(iv, ciphertext);
+ }
+
+ @Override
+ public byte[] decrypt(EncData encData) throws GeneralSecurityException {
+ // every encryption assumed to have its own IV
+ Cipher decCipher = createDecryptionCipher(transformation, key, encData.getIv());
+ return decCipher.doFinal(encData.getCiphertext());
+ }
+
+ private static Cipher createEncryptionCipher(String transformation, SecretKey key, byte[] iv)
+ throws GeneralSecurityException {
+ return createCipher(Cipher.ENCRYPT_MODE, transformation, key, iv);
+ }
+
+ private static Cipher createDecryptionCipher(String transformation, SecretKey key, byte[] iv)
+ throws GeneralSecurityException {
+ return createCipher(Cipher.DECRYPT_MODE, transformation, key, iv);
+ }
+
+ private static Cipher createCipher(int mode, String transformation, SecretKey key, byte[] iv)
+ throws GeneralSecurityException {
+ if (iv == null || iv.length == 0) {
+ throw new GeneralSecurityException("Initialization vector either null or empty.");
+ }
+ Cipher cipher = Cipher.getInstance(transformation, JCE_PROVIDER);
+        GCMParameterSpec gcmSpec = new GCMParameterSpec(KEY_SIZE, iv); // first argument is the auth tag length in bits
+ cipher.init(mode, key, gcmSpec);
+ return cipher;
+ }
+}
diff --git a/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/CryptoUtils.java b/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/CryptoUtils.java
new file mode 100644
index 0000000..b82c5ed
--- /dev/null
+++ b/encmod/src/main/java/io/strimzi/kafka/topicenc/enc/CryptoUtils.java
@@ -0,0 +1,65 @@
+/*
+ * Copyright Strimzi authors.
+ * License: Apache License 2.0 (see the file LICENSE or http://apache.org/licenses/LICENSE-2.0.html).
+ */
+package io.strimzi.kafka.topicenc.enc;
+
+import java.security.NoSuchAlgorithmException;
+import java.security.Provider;
+import java.security.SecureRandom;
+import java.security.Security;
+import java.util.Map;
+
+import javax.crypto.KeyGenerator;
+import javax.crypto.SecretKey;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Common general-purpose cryptographic functions and definitions.
+ */
+public class CryptoUtils {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(CryptoUtils.class);
+
+ public static final String AES = "AES";
+ public static final String AES_GCM_PADDING = AES + "/GCM/PKCS5Padding";
+ public static final String AES256_GCM_NOPADDING = "AES_256/GCM/NoPadding";
+
+ /**
+     * Create an array of bytes with random bits, suitable for use
+     * as a nonce or initialization vector.
+     *
+     * @param numBytes the number of random bytes to return
+     * @return a byte array of the requested length filled with random bytes
+ */
+ public static byte[] createRandom(int numBytes) {
+ byte[] buf = new byte[numBytes];
+ new SecureRandom().nextBytes(buf);
+ return buf;
+ }
+
+ public static SecretKey generateKey(String algo, int keySize) throws NoSuchAlgorithmException {
+ KeyGenerator kgen = KeyGenerator.getInstance(algo);
+ kgen.init(keySize);
+ return kgen.generateKey();
+ }
+
+ public static SecretKey generateAesKey(int keySize) throws NoSuchAlgorithmException {
+ return generateKey(AES, keySize);
+ }
+
+ public static void logCiphers() {
+ for (Provider provider : Security.getProviders()) {
+ LOGGER.debug("Cipher provider: {}", provider.getName());
+ for (Map.Entry