Getting Started Guides

Kryptonite for Kafka ships four independent integration modules. Pick the ones that fit your use cases and follow the module-specific guide.


Common Prerequisites

Kryptonite for Kafka modules share the same baseline requirements:

  • Java 17+ available on the host running the respective Kryptonite for Kafka module
  • Key material: either generate keysets with the purpose-built Keyset Tool or reuse an existing Tink keyset
  • Apache Kafka: a running cluster reachable from the module's runtime

Generating a Keyset

Every module needs at least one Tink keyset configured. The quickest way to obtain one is the Keyset Tool; for instance, generate a single keyset containing one AES_GCM key like so:

java -jar kryptonite-keyset-tool/target/kryptonite-keyset-tool-0.1.0.jar \
  -a AES_GCM -i my-demo-key -f FULL -p

This generates a ready-to-use keyset and pretty-prints it to stdout:

{
  "identifier": "my-demo-key",
  "material": {
    "primaryKeyId": 10000,
    "key": [{
      "keyData": {
        "typeUrl": "type.googleapis.com/google.crypto.tink.AesGcmKey",
        "value": "<BASE64_ENCODED_KEY>",
        "keyMaterialType": "SYMMETRIC"
      },
      "status": "ENABLED",
      "keyId": 10000,
      "outputPrefixType": "TINK"
    }]
  }
}
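A keyset in this shape can be loaded and sanity-checked programmatically before wiring it into a module. The following is a minimal sketch using only the Python standard library; the Base64 value is a dummy stand-in generated from zero bytes, not real key material:

```python
import base64
import json

# Keyset in the shape produced by the Keyset Tool; the Base64 value
# below is a dummy placeholder, NOT real key material.
keyset_json = """
{
  "identifier": "my-demo-key",
  "material": {
    "primaryKeyId": 10000,
    "key": [{
      "keyData": {
        "typeUrl": "type.googleapis.com/google.crypto.tink.AesGcmKey",
        "value": "%s",
        "keyMaterialType": "SYMMETRIC"
      },
      "status": "ENABLED",
      "keyId": 10000,
      "outputPrefixType": "TINK"
    }]
  }
}
""" % base64.b64encode(b"\x00" * 32).decode()

keyset = json.loads(keyset_json)
material = keyset["material"]

# The primaryKeyId must reference an ENABLED key entry in the key list.
primary = next(
    k for k in material["key"] if k["keyId"] == material["primaryKeyId"]
)
assert primary["status"] == "ENABLED"

# The value field holds the raw key bytes in Base64 encoding.
raw_key = base64.b64decode(primary["keyData"]["value"])
print(keyset["identifier"], len(raw_key))  # my-demo-key 32
```

A check like this catches copy-paste mistakes (truncated Base64, a primaryKeyId that matches no key entry) before the keyset ever reaches a running module.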

Key material is a secret

The value field holds the actual raw key in Base64 encoding. Treat it with the utmost secrecy, just like any important password. NEVER commit production keysets to source control!
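One common way to keep key material out of source control is to resolve the keyset at runtime from the environment rather than from a checked-in file. A hedged sketch, assuming an illustrative KRYPTONITE_KEYSET variable name (consult the module's configuration reference for the actual mechanism):

```python
import json
import os

# Illustrative variable name only, chosen for this example; it is not a
# documented Kryptonite for Kafka configuration option.
os.environ.setdefault("KRYPTONITE_KEYSET", '{"identifier": "my-demo-key"}')

def load_keyset() -> dict:
    """Read the keyset JSON from the environment instead of the repository."""
    raw = os.environ.get("KRYPTONITE_KEYSET")
    if raw is None:
        raise RuntimeError("KRYPTONITE_KEYSET is not set")
    return json.loads(raw)

keyset = load_keyset()
print(keyset["identifier"])  # my-demo-key
```

The same idea applies to secret stores and mounted secret files; the point is that the repository only ever contains the lookup logic, never the key bytes.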

See Key Management for production options regarding keyset storage and optional keyset encryption.