Rockset sink connector

Rockset is a real-time analytics database that enables queries on massive, semi-structured data without operational burden. Rockset is serverless and fully managed. It offloads the work of managing configuration, cluster provisioning, denormalization, and shard/index management. Rockset is also SOC 2 Type II compliant and offers encryption at rest and in flight, securing and protecting any sensitive data.

Rockset is compute-optimized, making it suitable for serving high-concurrency applications on datasets in the sub-100 TB range, or larger (hundreds of TBs) when using roll-ups. It is not a data warehouse and is not suited to occasional queries over petabyte-scale datasets.

If you are interested in a direct Decodable Connector for Rockset, contact support@decodable.co or join our Slack community and let us know!

Getting started

Sending a Decodable data stream to Rockset is accomplished in two stages: first, create a Decodable sink connector that writes to a system Rockset supports as a data source; then, add that system as a data source in your Rockset configuration. Decodable and Rockset mutually support the following technologies:

  • Amazon Kinesis

  • Amazon S3

  • Apache Kafka

  • Confluent Cloud

  • PostgreSQL

Configure as a sink

This example demonstrates using Kafka as the sink from Decodable and the source for Rockset. Sign in to Decodable Web and follow the configuration steps provided in the Apache Kafka sink connector topic to create a sink connector. For examples of using the command line tools or scripting, see the How To guides.
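If you prefer the command line, a Kafka sink connection can be created along these lines. This is a sketch only: the connection name, stream ID, topic, broker address, and property values below are placeholders, and flag names may differ between CLI versions, so verify them with `decodable connection create --help`.

```shell
# Sketch: create a Kafka sink connection with the Decodable CLI.
# All values are placeholders; adjust to your cluster and stream.
decodable connection create \
  --name rockset-kafka-sink \
  --connector kafka \
  --type sink \
  --stream-id 2a3b4c5d \
  --prop bootstrap.servers=pkc-12345.us-east-1.aws.confluent.cloud:9092 \
  --prop topic=rockset-events \
  --prop value.format=json
```

Once created, activate the connection so records begin flowing to the Kafka topic that Rockset will read from.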

Create Kafka data source

Rockset supports Kafka clusters running in Confluent Cloud, as well as self-managed Confluent Platform and Apache Kafka clusters. To use Confluent Cloud with Rockset, follow these steps:

  1. Identify your bootstrap server URL. You can find this on Confluent Cloud by navigating to Cluster settings > Identification.

  2. Identify your Confluent Cloud cluster’s API key and secret. You can create a new API key or use an existing one from the API keys tab under Data integration.

  3. Input these fields into the Kafka integration creation form and save your integration.
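The three values gathered above are the same ones a Kafka client needs to reach Confluent Cloud. As an illustration, a minimal sketch of the corresponding client configuration is shown below; the bootstrap URL, API key, and secret are placeholder assumptions, and the resulting dictionary is the shape you would pass to a Kafka consumer or producer library.

```python
# Sketch: assemble the Confluent Cloud connection settings that the
# Rockset Kafka integration form asks for (bootstrap server, API key,
# API secret). All concrete values here are placeholders.

def confluent_cloud_config(bootstrap_servers: str, api_key: str, api_secret: str) -> dict:
    """Build the SASL/PLAIN client settings Confluent Cloud expects."""
    return {
        "bootstrap.servers": bootstrap_servers,  # step 1: cluster endpoint
        "security.protocol": "SASL_SSL",         # Confluent Cloud requires TLS
        "sasl.mechanisms": "PLAIN",              # API key/secret authentication
        "sasl.username": api_key,                # step 2: API key
        "sasl.password": api_secret,             # step 2: API secret
    }

config = confluent_cloud_config(
    "pkc-12345.us-east-1.aws.confluent.cloud:9092",  # placeholder URL
    "MY_API_KEY",
    "MY_API_SECRET",
)
print(config["security.protocol"])  # SASL_SSL
```

The same key/secret pair works for any client of the cluster, which is why Rockset can reuse an existing API key rather than requiring a new one.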

For more information, see Rockset’s Kafka documentation.