Imply Enterprise sink integration

This is a supported integration that requires manual configuration.

Contact Decodable support if you are interested in native support via a dedicated Decodable connector.

Sending a Decodable data stream to Imply Enterprise is accomplished in two stages:

  1. Creating a sink connector from Decodable to a data source that's supported by Imply.

  2. Adding that data source to your Imply Enterprise configuration.

Decodable and Imply both support several of the same technologies, including the following:

  • Amazon Kinesis

  • Apache Kafka

Add a Kafka sink connector

Follow the configuration steps provided for the Apache Kafka sink connector.
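Once the sink connector is running, it can be useful to confirm that records are actually landing on the target topic before pointing Imply at it. The snippet below is a minimal sketch using the kafka-python client; the topic name and broker address are placeholders for whatever you configured on the sink connector.

```python
# Minimal sketch: confirm the Decodable sink is writing records to the target
# Kafka topic before adding it to Imply. Topic name and broker address are
# placeholders -- use the values configured on your sink connector.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "decodable-output",                  # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,          # stop iterating after 10s with no records
)

for i, record in enumerate(consumer):
    print(record.topic, record.partition, record.offset, record.value[:80])
    if i >= 4:  # a handful of records is enough to confirm data is flowing
        break

consumer.close()
```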

Create a Kafka data source

Use the Kafka indexing service to ingest data into Imply from Kafka. This service offers exactly-once ingestion guarantees as well as the ability to ingest historical data. You can load data from Kafka in the Druid Console using the Kafka data loader as follows (a sketch of the ingestion spec this produces appears after these steps):

  1. Enable Druid Kafka ingestion

  2. Load real-time data from Kafka

  3. Build a data cube
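The data loader walks you through these choices in the Druid Console UI; under the hood it produces a Kafka supervisor spec that is submitted to the Overlord. The sketch below shows roughly what that spec looks like when submitted directly through the Apache Druid API with Python; the endpoint is standard Druid, but every value (URL, topic, broker address, datasource name, timestamp column) is a placeholder for your environment.

```python
# Minimal sketch of the kind of Kafka supervisor spec the data loader generates,
# submitted to the Druid indexer API. All values below are illustrative.
import requests

supervisor_spec = {
    "type": "kafka",
    "spec": {
        "dataSchema": {
            "dataSource": "decodable_events",  # hypothetical datasource name
            "timestampSpec": {"column": "timestamp", "format": "iso"},
            # Empty dimensions list: Druid discovers string dimensions automatically.
            "dimensionsSpec": {"dimensions": []},
            "granularitySpec": {"segmentGranularity": "hour", "queryGranularity": "none"},
        },
        "ioConfig": {
            "topic": "decodable-output",  # topic written by the Decodable sink connector
            "inputFormat": {"type": "json"},
            "consumerProperties": {"bootstrap.servers": "localhost:9092"},
            "useEarliestOffset": True,
        },
        "tuningConfig": {"type": "kafka"},
    },
}

# Submit the spec to the supervisor endpoint (the Router proxies it to the Overlord).
resp = requests.post(
    "http://localhost:8888/druid/indexer/v1/supervisor",
    json=supervisor_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the supervisor id on success
```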

For more detailed information, see the Kafka ingestion tutorial in the Imply documentation.