How To: Set up SASL Authentication with Apache Kafka

A common method for securing client connections to Apache Kafka brokers is through SASL. Decodable supports a number of SASL authentication mechanisms, both with and without SSL/TLS encryption.

Consult the Apache Kafka source connector page for full details and configuration reference.
Note: self-signed certificates aren’t supported

When using SASL authentication with SSL/TLS encryption (SASL_SSL), the broker must present a certificate signed by a trusted certificate authority.
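To confirm that your broker presents a trusted certificate chain before configuring the connection, you can inspect it with openssl (the host and port below are placeholders; substitute your own broker address):

```shell
# Inspect the certificate chain presented by the broker on its SASL_SSL port
# (9093 is a common convention; use your broker's actual host and port).
openssl s_client -connect broker.example.com:9093 -showcerts </dev/null

# "Verify return code: 0 (ok)" means the chain is signed by a CA in the local
# trust store; self-signed chains typically report code 18 or 19 instead.
```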

This page explains how to set up SASL authentication for a connection between Kafka and Decodable.

We’ll assume here that you already have a Decodable account and have gotten started with the Decodable CLI. If you haven’t done that yet, see The Decodable CLI to learn how to install and set up the Decodable CLI.

Create a stream

decodable stream create --name kafka_sasl_input_stream  \
  --description "input stream"                          \
  --field value=string

Create a connection

When configuring a SASL connection, several properties are required: security.protocol, sasl.mechanism, and the SASL credentials (sasl.username and sasl.password).

Here is an example resource YAML that describes a Kafka SASL connection you can create using the Decodable CLI:

---
kind: connection
metadata:
  name: kafka-sasl-source
  description: Kafka source connection with SASL/SSL
spec_version: v2
spec:
  connector: kafka
  type: source
  stream_mappings:
    - stream_name: kafka_sasl_input_stream
      external_resource_specifier:
        topic: source_topic
  properties:
    value.format: json
    bootstrap.servers: <broker_list>
    security.protocol: SASL_SSL
    sasl.mechanism: PLAIN
    sasl.username: <username>
    sasl.password: <password>

Note: sasl.password must be supplied as a secret.
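Assuming the YAML above is saved as kafka-sasl-source.yaml, the connection can be created declaratively with the CLI. The secret referenced by sasl.password must exist first; the exact flags for creating it are illustrative here, so confirm them with the CLI's built-in help:

```shell
# Create the secret holding the SASL password first.
# (Flags are illustrative -- confirm with `decodable secret --help`.)
decodable secret create --name kafka-sasl-password --value <password>

# Apply the resource YAML to create (or update) the connection.
decodable apply kafka-sasl-source.yaml
```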

Test the connection

The quickest way to test the connection is to activate it and run a preview job. After activation, verify that the connection activated successfully by checking its actual state.

Note that it can take up to 1 minute for the state to update.

Activate the connection

$ decodable connection activate <connection_id>

$ decodable connection get <connection_id>

kafka-sasl-source
id                          <connection_id>
description
connector                   kafka
type                        source
stream mappings
  stream id                 <stream_id>
  external resource specifier
    topic                   source_topic
schema
properties
  value.format:             json
  bootstrap.servers:        <broker_list>
  security.protocol:        SASL_SSL
  sasl.mechanism:           PLAIN
  sasl.username:            <username>
  sasl.password:            <password>
[…]

Create a preview job

Run a preview to read from the stream that the source Kafka connection writes into. If you produce JSON strings to the source_topic topic, you should see sample data in the preview command's output.

Note that it can take up to 1 minute for data to appear.

decodable pipeline preview "SELECT * FROM kafka_sasl_input_stream"
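If you don't already have a producer handy, a test record can be sent with the standard Kafka console producer using client settings that mirror the connection above. The broker list, credentials, and record below are placeholders; the client properties themselves are standard Kafka SASL/PLAIN settings:

```shell
# Client settings matching the connection's SASL_SSL / PLAIN configuration
# (substitute your own broker list and credentials).
cat > client-sasl.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<username>" password="<password>";
EOF

# Produce a JSON record matching the stream's schema (a single "value" field).
echo '{"value": "hello"}' | kafka-console-producer.sh \
  --bootstrap-server <broker_list> \
  --topic source_topic \
  --producer.config client-sasl.properties
```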