SingleStore DB sink integration

This integration is supported, but it requires manual configuration.

If you are interested in native support with a Decodable connector, drop us a line or join our Slack community and let us know!

Sending a Decodable data stream to SingleStore is accomplished in two stages:

  1. Creating a sink connector from Decodable to a destination that SingleStore can ingest from.

  2. Adding that destination as a data source in your SingleStore configuration.

Decodable and SingleStore both support several technologies, including the following:

  • Amazon S3

  • Apache Kafka

Add a Kafka sink connector

Follow the configuration steps provided for the Apache Kafka sink connector.
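At minimum, the Decodable Kafka sink connector needs to know which brokers to reach, which topic to write to, and how to serialize records. A minimal sketch of the relevant settings follows; the broker host, topic name, and format value are hypothetical placeholders, and the exact property names may differ in the Decodable UI:

```properties
# Illustrative values only — substitute your own cluster details
bootstrap.servers=broker-host:9092
topic=my-output-topic
value.format=json
```

Use a JSON value format here if you plan to ingest the topic into SingleStore with a `FORMAT JSON` pipeline, since that keeps the two ends of the integration aligned.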

Create a Kafka data source

To load data from Kafka into SingleStore, use one of the following approaches:

  1. Load data using the SingleStore Kafka Connector. This is a Kafka Connect sink (target) connector that reads Avro, JSON, and CSV messages from Kafka topics and writes them to SingleStoreDB tables.
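The Kafka Connector approach above is configured like any other Kafka Connect sink, via a connector properties payload. The sketch below shows the general shape, assuming the `com.singlestore.kafka.SingleStoreSinkConnector` connector class; the endpoint, database, topic, and credential values are hypothetical placeholders you would replace with your own:

```json
{
  "name": "singlestore-sink",
  "config": {
    "connector.class": "com.singlestore.kafka.SingleStoreSinkConnector",
    "topics": "my-output-topic",
    "connection.ddlEndpoint": "singlestore-host:3306",
    "connection.database": "my_database",
    "connection.user": "admin",
    "connection.password": "********"
  }
}
```

You would POST this payload to your Kafka Connect REST endpoint (or supply it through your Connect management tooling) to start the sink.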

  2. Alternatively, load data from Kafka using a pipeline. You will first need to upload a certificate in order to connect via TLS/SSL. Then you can create a Kafka pipeline in SingleStore.
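A SingleStore pipeline is defined in SQL with `CREATE PIPELINE … LOAD DATA KAFKA`. The sketch below assumes a JSON-formatted topic and uses hypothetical broker, topic, table, and column names:

```sql
-- Illustrative only: replace broker, topic, table, and field names
CREATE PIPELINE my_kafka_pipeline AS
LOAD DATA KAFKA 'broker-host:9092/my-output-topic'
INTO TABLE my_table
FORMAT JSON
(id <- id, name <- name);

START PIPELINE my_kafka_pipeline;
```

The `field <- json_path` mappings tie JSON keys in each Kafka message to columns of the target table; `START PIPELINE` begins continuous ingestion.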

For more detailed information, see SingleStore’s Kafka documentation.