# SingleStore DB sink integration

This is a supported integration that requires manual configuration. Contact Decodable support if you are interested in native support with a Decodable connector.

Sending a Decodable data stream to SingleStore is accomplished in two stages:

1. Creating a sink connector from Decodable to a data source that SingleStore supports.
2. Adding that data source to your SingleStore configuration.

Decodable and SingleStore mutually support several technologies, including the following:

- Amazon S3
- Apache Kafka

## Add a Kafka sink connector

Follow the configuration steps provided for the Apache Kafka sink connector.

## Create a Kafka data source

To load data from Kafka into SingleStore, do one of the following:

- Load data using the SingleStore Kafka Connector. The SingleStore Kafka Connector is a Kafka Connect connector that ingests Avro, JSON, and CSV messages from Kafka topics into SingleStoreDB. Specifically, it is a sink (target) connector designed to read data from Kafka topics and write that data to SingleStoreDB tables.
- Alternatively, load data from Kafka using a pipeline. You will first need to upload a certificate to use to connect via TLS/SSL. Then you can create a Kafka pipeline in SingleStore.

For more detailed information, see SingleStore's Kafka documentation.
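As a rough sketch, a Kafka Connect sink configuration for the SingleStore Kafka Connector might look like the following. The hostnames, database, credentials, and topic name are placeholders, and the exact property names should be verified against the SingleStore Kafka Connector documentation:

```properties
# Kafka Connect sink configuration for the SingleStore Kafka Connector (sketch).
# Endpoint, database, credentials, and topic below are placeholders.
name=singlestore-sink
connector.class=com.singlestore.kafka.SingleStoreSinkConnector
tasks.max=1
topics=my-decodable-topic
connection.ddlEndpoint=singlestore-host:3306
connection.database=my_database
connection.user=admin
connection.password=********
```

Deploy this file to your Kafka Connect cluster as you would any other sink connector configuration.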
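The pipeline alternative can be sketched in SQL as follows. The pipeline name, broker address, topic, table, and column names are all placeholders, and the example assumes the target table already exists and that messages on the topic are JSON; consult SingleStore's `CREATE PIPELINE` documentation for TLS/SSL options and full syntax:

```sql
-- Sketch of a SingleStore Kafka pipeline (all names and hosts are placeholders).
-- Assumes a target table `events` already exists and messages are JSON objects.
CREATE PIPELINE kafka_events AS
  LOAD DATA KAFKA 'kafka-broker:9092/my-decodable-topic'
  INTO TABLE events
  FORMAT JSON
  (id <- id, payload <- payload);

-- Begin consuming from the topic.
START PIPELINE kafka_events;
```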