Redis sink integration

This is a supported integration that requires manual configuration.

Contact Decodable support if you are interested in native support with a Decodable connector.

Sending a Decodable data stream to Redis is accomplished in two stages:

  1. Creating a sink connector from Decodable to a data source that’s supported by Redis.

  2. Adding that data source to your Redis configuration.

Both Decodable and Redis support several of the same technologies, including Confluent Cloud.

Add a Confluent Cloud sink connector

Follow the configuration steps provided for the Confluent Cloud sink connector.
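
The Decodable connector handles producing to Confluent Cloud for you; the sketch below, using the confluent-kafka Python client, only illustrates the shape of the keyed records that land on the topic and that the Redis sink later consumes. The bootstrap server, credentials, topic name, and record contents are placeholders.

```python
# Illustrative only: the Decodable sink connector produces to the topic for you.
# This sketch shows the kind of keyed records the Redis sink will later consume.
# The bootstrap server, credentials, and topic name below are placeholders.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<kafka-api-key>",
    "sasl.password": "<kafka-api-secret>",
})

# The record key becomes the Redis key; the record value becomes the Redis value.
producer.produce("orders", key="order:1001", value='{"status": "shipped"}')
producer.flush()
```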

Create a Confluent Cloud data source in Redis

The Kafka Connect Redis Sink connector for Confluent Cloud is used to export data from Apache Kafka® topics to Redis. The connector works with Redis Enterprise Cloud, Azure Cache for Redis, and Amazon ElastiCache for Redis. Its primary features are:

  • At-least-once delivery: The connector guarantees that records are delivered at least once.

  • Supports multiple tasks: The connector supports running one or more tasks.

  • SSL support: Supports one-way SSL.

  • Deletions: The connector supports deletions. If the record stored in Kafka has a null value, the connector sends a delete message with the corresponding key to Redis (see the sketch after this list).

  • Supported input data formats: This connector supports storing raw bytes or strings (as inserts) in Redis.
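
The following minimal sketch, using the redis-py client, mirrors the insert and delete semantics described above. It is an illustration of the connector's documented behavior, not its actual implementation; the host, port, keys, and values are placeholders.

```python
# A minimal sketch of the write semantics described above, using redis-py.
# Not the connector's code; it mirrors the documented behavior: a record with
# a value is stored as a string insert, and a null value (a Kafka tombstone)
# deletes the corresponding key. Host, port, and records are placeholders.
import redis

r = redis.Redis(host="my-redis.example.com", port=6379, ssl=True)  # one-way SSL

records = [
    ("order:1001", b'{"status": "shipped"}'),  # value present: insert/overwrite
    ("order:1001", None),                      # null value (tombstone): delete
]

for key, value in records:
    if value is None:
        r.delete(key)      # null value: remove the key from Redis
    else:
        r.set(key, value)  # raw bytes or string stored as an insert
```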

Use the Confluent Cloud console to complete the following steps (a programmatic alternative is sketched after this list):

  1. Open your Confluent Cloud cluster

  2. Add a Redis Sink connector

  3. Enter the connector details

    • Select a topic

    • Provide Kafka credentials

    • Enter your Redis connection details

  4. Check the results in Redis
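
If you prefer automation over the console, Confluent Cloud also exposes a Connect REST API for managing connectors. The sketch below is a hedged example of creating the Redis Sink connector that way; the endpoint shape and the configuration keys (for example redis.hostname and redis.portnumber) are assumptions based on Confluent's documentation, so verify them there before use. All IDs, hostnames, and credentials are placeholders.

```python
# A hedged sketch of creating the managed Redis Sink connector through
# Confluent Cloud's Connect REST API instead of the console. The endpoint
# shape and the config keys below are assumptions drawn from Confluent's
# documentation; verify both before use. All IDs and credentials are placeholders.
import requests

ENV_ID = "env-xxxxx"      # placeholder environment ID
CLUSTER_ID = "lkc-xxxxx"  # placeholder Kafka cluster ID
url = (f"https://api.confluent.cloud/connect/v1/environments/{ENV_ID}"
       f"/clusters/{CLUSTER_ID}/connectors")

connector = {
    "name": "redis-sink",
    "config": {
        "connector.class": "RedisSink",     # managed connector class
        "input.data.format": "STRING",      # connector stores raw bytes or strings
        "kafka.auth.mode": "KAFKA_API_KEY",
        "kafka.api.key": "<kafka-api-key>",
        "kafka.api.secret": "<kafka-api-secret>",
        "topics": "orders",                 # topic the Decodable sink writes to
        "redis.hostname": "my-redis.example.com",
        "redis.portnumber": "6379",
        "redis.ssl.mode": "enabled",        # one-way SSL
        "tasks.max": "1",
    },
}

resp = requests.post(
    url,
    json=connector,
    auth=("<cloud-api-key>", "<cloud-api-secret>"),  # Cloud API key, not Kafka key
)
resp.raise_for_status()
print(resp.json())
```

Once records are flowing, step 4 can be as simple as connecting with redis-cli and running GET against a key you expect the connector to have written.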

For more detailed information, see Confluent Cloud’s Redis Sink connector documentation.