Redis is an open source (BSD licensed), in-memory data structure store used as a database, cache, message broker, and streaming engine. Redis provides data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams. Redis has built-in replication, Lua scripting, LRU eviction, transactions, and different levels of on-disk persistence, and provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
To achieve top performance, Redis works with an in-memory dataset. Depending on your use case, Redis can persist your data either by periodically dumping the dataset to disk or by appending each command to a disk-based log. You can also disable persistence if you just need a feature-rich, networked, in-memory cache.
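The two persistence modes mentioned above correspond to RDB snapshots (periodic dumps) and the append-only file (AOF, a disk-based command log). As a minimal sketch, a `redis.conf` might select between them like this; the directive names are standard Redis configuration, but the specific thresholds are illustrative:

```conf
# RDB snapshotting: dump the dataset to disk if at least
# 1000 keys changed within 60 seconds (illustrative threshold)
save 60 1000

# Append-only file: log every write command, fsync once per second
appendonly yes
appendfsync everysec

# For a pure in-memory cache with no persistence, use instead:
#   save ""
#   appendonly no
```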
Sending a Decodable data stream to Redis is accomplished in two stages: first, create a sink connector from Decodable to an intermediary system that Redis can ingest from, and then configure Redis to read from that system. Decodable and Redis mutually support several such technologies, including Confluent Cloud.
This example uses Confluent Cloud as both the sink for Decodable and the source for Redis. Sign in to the Decodable Web Console and follow the configuration steps for the Confluent Cloud connector to create a sink connector. For examples of using the command-line tools or scripting, see the How To guides.
The Kafka Connect Redis Sink connector for Confluent Cloud is used to export data from Apache Kafka® topics to Redis. The connector works with Redis Enterprise Cloud, Azure Cache for Redis, and Amazon ElastiCache for Redis. Its primary features are:
- At least once delivery: The connector guarantees that records are delivered at least once.
- Supports multiple tasks: The connector supports running one or more tasks.
- SSL support: Supports one-way SSL.
- Deletions: The connector supports deletions. If the record stored in Kafka has a null value, the connector sends a delete message with the corresponding key to Redis.
- Supported input data formats: This connector supports storing raw bytes or strings (as inserts) in Redis.
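The delete behavior described above can be sketched in plain Python: a record whose value is `None` (a Kafka tombstone) becomes a delete of the corresponding key, while any other value becomes a set. This is an illustrative model of the connector's semantics, not its actual implementation; the `apply_record` helper and the dict-backed store are assumptions for the example:

```python
from typing import Optional


def apply_record(store: dict, key: str, value: Optional[bytes]) -> str:
    """Mimic the Redis sink's handling of one Kafka record.

    A None value (a Kafka tombstone) maps to a DEL of the key;
    any other value maps to a SET. Returns the command applied.
    """
    if value is None:
        store.pop(key, None)   # DEL: remove the key if present
        return "DEL"
    store[key] = value         # SET: store raw bytes/strings as-is
    return "SET"


# Simulate a small stream of records, including a tombstone
store = {}
apply_record(store, "user:1", b"alice")
apply_record(store, "user:2", b"bob")
apply_record(store, "user:1", None)   # tombstone deletes user:1
```

Note that deleting an absent key is a no-op, which matters under the connector's at-least-once delivery: replaying a tombstone is harmless.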
Use the Confluent Cloud console to perform the following steps:
- Launch your Confluent Cloud cluster
- Add a Redis Sink connector
- Enter the connector details
- Select a topic
- Provide Kafka credentials
- Enter your Redis connection details
- Check the results in Redis
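The last step above can be done with `redis-cli`. As an illustrative sketch only: the key names depend on your topic and connector configuration, and this assumes a reachable Redis instance with appropriate credentials:

```shell
# List keys written by the connector (prefer SCAN over KEYS on large datasets)
redis-cli --scan --pattern 'user:*'

# Fetch a single value by key
redis-cli GET user:1
```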
For more detailed information, please refer to Confluent Cloud's Redis documentation.
| Property | Value |
| --- | --- |
| Delivery guarantee | At least once |