Imply Enterprise sink connector

Imply Enterprise is a self-service analytics platform that gives users powerful visualization, management, monitoring, and security components. With Apache Druid at its core, Imply powers real-time analytic workloads for event-driven data, making that data accessible and useful to more people in your organization. Common application areas for Imply include:

  • Clickstream analytics (web and mobile analytics)

  • Risk/fraud analysis

  • Network telemetry analytics (network performance monitoring)

  • Server metrics storage

  • Supply chain analytics (manufacturing metrics)

  • Application performance metrics

  • Business intelligence / OLAP

If you are interested in a direct Decodable Connector for Imply Enterprise, contact support@decodable.co or join our Slack community and let us know!

Getting started

Sending a Decodable data stream to Imply Enterprise is accomplished in two stages: first, create a sink connector to a data source that's supported by Imply; then, add that data source to your Imply Enterprise configuration. Decodable and Imply mutually support several technologies, including the following:

  • Amazon Kinesis

  • Apache Kafka

Configure as a sink

This example uses Kafka as the destination for the Decodable sink and as the source for Imply Enterprise. Sign in to Decodable Web and follow the configuration steps provided in the Apache Kafka sink connector topic to create a sink connector. For examples of using the command line tools or scripting, see the How To guides.
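At a minimum, a Kafka sink connection needs to know where the brokers are, which topic to write to, and how to serialize records. The sketch below is illustrative only; the property names are assumptions, and the Apache Kafka sink connector topic lists the exact fields and accepted values:

```properties
# Illustrative Kafka sink connection settings (names are placeholders;
# see the Apache Kafka sink connector topic for the authoritative list)
bootstrap.servers=broker-1.example.com:9092
topic=decodable-output
value.format=json
```

Whatever topic you choose here is the topic you will point the Imply Kafka data loader at in the next step.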

Create Kafka data source

Use the Kafka indexing service to ingest data into Imply from Kafka. This service offers exactly-once ingestion guarantees as well as the ability to ingest historical data. You can load data from Kafka in the Druid console using the Kafka data loader as follows:

  1. Enable Druid Kafka ingestion

  2. Load real-time data from Kafka

  3. Build a data cube

For more detailed information, see the Kafka ingestion tutorial in the Imply documentation.
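Under the hood, the data loader steps above produce a Kafka supervisor spec that Druid uses to manage ingestion. A minimal sketch of such a spec is shown below, assuming a topic named `decodable-output` and a JSON payload with an ISO-8601 `timestamp` column; the broker address, topic, data source name, and column names are all placeholders for your own values:

```json
{
  "type": "kafka",
  "spec": {
    "ioConfig": {
      "type": "kafka",
      "consumerProperties": { "bootstrap.servers": "broker-1.example.com:9092" },
      "topic": "decodable-output",
      "inputFormat": { "type": "json" },
      "useEarliestOffset": true
    },
    "dataSchema": {
      "dataSource": "decodable_events",
      "timestampSpec": { "column": "timestamp", "format": "iso" },
      "dimensionsSpec": { "dimensions": [] },
      "granularitySpec": {
        "segmentGranularity": "hour",
        "queryGranularity": "none",
        "rollup": false
      }
    },
    "tuningConfig": { "type": "kafka" }
  }
}
```

The Druid console builds an equivalent spec interactively as you walk through the data loader; you can also submit a spec like this directly to the Overlord's `/druid/indexer/v1/supervisor` endpoint.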