SingleStore DB sink connector

SingleStoreDB is a distributed relational database that handles both transactions and real-time analytics at scale. It is accessible through standard SQL drivers and supports ANSI SQL syntax, including joins, filters, and analytical capabilities such as aggregates, GROUP BY, and window functions.
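
For example, a single ANSI SQL query can combine aggregation, GROUP BY, and a window function. The following sketch assumes a hypothetical page_views table with user_id and url columns:

    -- Count views per user and rank users by view count in one query
    -- (table and column names are hypothetical).
    SELECT
      user_id,
      COUNT(*) AS views,
      RANK() OVER (ORDER BY COUNT(*) DESC) AS view_rank
    FROM page_views
    GROUP BY user_id;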

SingleStoreDB scales horizontally on cloud instances or industry-standard hardware, providing high throughput across a wide range of platforms. It maintains broad compatibility with common technologies in the modern data processing ecosystem, such as orchestration platforms, developer IDEs, and BI tools, so you can easily integrate it into your existing environment. It features an in-memory rowstore and an on-disk columnstore to handle both highly concurrent operational workloads and analytical workloads. SingleStoreDB also features a data ingestion technology, SingleStore Pipelines, that streams large amounts of data into the database at high throughput with exactly-once semantics.
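
As a rough illustration of choosing between the two storage formats, the DDL below sketches one table of each kind. The table and column names are hypothetical, and the exact syntax varies by SingleStoreDB version:

    -- In-memory rowstore table for point reads and writes (hypothetical names).
    CREATE ROWSTORE TABLE orders_live (
      order_id BIGINT PRIMARY KEY,
      amount   DECIMAL(10, 2)
    );

    -- On-disk columnstore table for analytical scans, declared here via a
    -- clustered columnstore key.
    CREATE TABLE orders_history (
      order_id BIGINT,
      amount   DECIMAL(10, 2),
      KEY (order_id) USING CLUSTERED COLUMNSTORE
    );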

If you are interested in a direct Decodable Connector for SingleStore, contact support@decodable.co or join our Slack community and let us know!

Getting started

Sending a Decodable data stream to SingleStore is accomplished in two stages: first, create a sink connector to a system that SingleStore supports as a data source; then add that data source to your SingleStore configuration. Decodable and SingleStore mutually support several technologies, including the following:

  • Amazon S3

  • Apache Kafka

Configure as a sink

This example demonstrates using Kafka as the sink for data from Decodable and the source for SingleStore. Sign in to Decodable Web and follow the configuration steps in the Apache Kafka sink connector topic to create a sink connector. For examples of using the command-line tools or scripting, see the How To guides.

Create Kafka data source

To load data from Kafka into SingleStore, follow these steps:

  1. Securely connect to Kafka from SingleStoreDB. You can connect via SSL and optionally authenticate through SASL. The following SASL authentication mechanisms are supported: GSSAPI (Kerberos), PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.

  2. Load data using the SingleStore Confluent Kafka Connector, a Kafka Connect connector that ingests Avro, JSON, and CSV messages from Kafka topics into SingleStoreDB. More specifically, it is a sink (target) connector designed to read data from Kafka topics and write that data to SingleStoreDB tables.

  3. Alternatively, load data from Kafka using a pipeline. First upload the certificate that SingleStore will use to connect via TLS/SSL, then create a Kafka pipeline in SingleStore, as shown in the sketch after this list.

  4. Test your Kafka cluster using kcat (formerly kafkacat).
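
For steps 1 and 3, the pipeline approach can be sketched roughly as follows. The broker address, topic, table, columns, and certificate path are all hypothetical, and the CONFIG clause shows only the SSL case; SASL settings are passed in a similar way:

    -- Minimal sketch; broker, topic, table, columns, and paths are hypothetical.
    -- The CONFIG JSON enables an SSL connection to the Kafka brokers (step 1).
    CREATE PIPELINE kafka_events AS
      LOAD DATA KAFKA 'kafka-broker-host:9092/events-topic'
      CONFIG '{"security.protocol": "ssl",
               "ssl.ca.location": "/var/private/ssl/ca-cert.pem"}'
      INTO TABLE events
      FORMAT JSON (event_id <- event_id, payload <- payload);

    -- Preview a few messages without committing them, then start ingestion.
    TEST PIPELINE kafka_events LIMIT 1;
    START PIPELINE kafka_events;

TEST PIPELINE reads from the topic without writing to the destination table, which makes it a quick check that connectivity and the FORMAT mapping behave as expected before you start the pipeline.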

For more detailed information, see SingleStore’s Kafka documentation.