SingleStore DB
SingleStoreDB is a distributed, relational database that handles both transactions and real-time analytics at scale. It is accessible through standard SQL drivers and supports ANSI SQL syntax including joins, filters, and analytical capabilities (e.g. aggregates, group by, and windowing functions).
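As an illustration of that SQL surface, here is a query combining a join, a filter, and a window function. This is only a sketch: the `orders` and `customers` tables and their columns are hypothetical.

```sql
-- Rank each customer's orders by amount within the customer's region
-- (hypothetical schema; standard ANSI SQL).
SELECT
    c.region,
    c.name,
    o.amount,
    RANK() OVER (PARTITION BY c.region ORDER BY o.amount DESC) AS region_rank
FROM orders AS o
JOIN customers AS c ON c.id = o.customer_id
WHERE o.order_date >= '2024-01-01';
```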
SingleStoreDB scales horizontally on cloud instances or industry-standard hardware, providing high throughput across a wide range of platforms. It maintains broad compatibility with common technologies in the modern data processing ecosystem (e.g. orchestration platforms, developer IDEs, and BI tools), so you can easily integrate it into your existing environment. It features an in-memory rowstore and an on-disk columnstore to handle both highly concurrent operational workloads and analytical workloads. SingleStoreDB also offers a data ingestion technology called SingleStore Pipelines, which streams large amounts of data at high throughput into the database with exactly-once semantics.
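As a minimal sketch of the two storage engines, the DDL below creates one table of each kind. Table names and columns are hypothetical, and the exact keywords vary by SingleStoreDB version, so check the documentation for your release.

```sql
-- In-memory rowstore table for point lookups and high-concurrency
-- operational work (the ROWSTORE keyword applies in recent
-- SingleStoreDB versions).
CREATE ROWSTORE TABLE session_state (
    session_id BIGINT PRIMARY KEY,
    payload JSON
);

-- On-disk columnstore table (the default table type in recent versions)
-- for large analytical scans; SORT KEY orders data on disk.
CREATE TABLE events (
    event_time DATETIME,
    user_id BIGINT,
    event_type VARCHAR(64),
    SORT KEY (event_time)
);
```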
Getting Started
Sending a Decodable data stream to SingleStore is accomplished in two stages: first, create a sink connector to a technology that SingleStore can ingest from; then, add that technology as a data source in your SingleStore configuration. Decodable and SingleStore mutually support several technologies, including the following:
- Amazon S3
- Apache Kafka
Configure As A Sink
This example uses Kafka as the sink from Decodable and the source for SingleStore. Sign in to the Decodable Web Console and follow the configuration steps provided for the Kafka Connector to create a sink connector. For examples of using the command-line tools or scripting, see the How To guides.
Create Kafka Data Source
To load data from Kafka into SingleStore, perform the following steps:
- Securely connect to Kafka from SingleStoreDB. You can connect via SSL and optionally authenticate through SASL. The following SASL authentication mechanisms are supported: GSSAPI (Kerberos), PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512. A hedged configuration sketch follows this list.
- Load data with the SingleStore Confluent Kafka Connector, a Kafka Connect connector that lets you easily ingest AVRO, JSON, and CSV messages from Kafka topics into SingleStoreDB. Specifically, it is a Sink (target) connector that reads data from Kafka topics and writes it to SingleStoreDB tables.
- Alternatively, load data from Kafka using a SingleStore Pipeline. First upload the certificate you will use to connect via TLS/SSL, then create a Kafka pipeline in SingleStore (see the sketch after this list).
- Test your Kafka cluster using kcat (formerly kafkacat).
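The pipeline route can be sketched in SQL. The broker address, topic, table, columns, and credentials below are all hypothetical, and the exact CONFIG and CREDENTIALS options depend on your SingleStoreDB version, so consult SingleStore's CREATE PIPELINE documentation before adapting this.

```sql
-- Hypothetical names throughout; CONFIG and CREDENTIALS take
-- librdkafka-style JSON options, and the CA certificate is assumed
-- to have been uploaded to the SingleStore nodes already.
CREATE PIPELINE kafka_orders
AS LOAD DATA KAFKA 'broker.example.com:9093/orders-topic'
CONFIG '{
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-512",
    "ssl.ca.location": "/path/to/ca-cert.pem"
}'
CREDENTIALS '{
    "sasl.username": "decodable_user",
    "sasl.password": "<password>"
}'
INTO TABLE orders
FORMAT JSON (order_id <- order_id, amount <- amount);

-- Begin consuming from the topic.
START PIPELINE kafka_orders;
```

Once started, you can check ingestion status through the `information_schema.PIPELINES` view.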
For more detailed information, please refer to SingleStore's Kafka documentation.
Reference
| Connector name | singlestore-db |
| Type | sink |
Additional SingleStore Support
If you are interested in a direct Decodable Connector for SingleStore, please contact [email protected] or join our Slack community and let us know!