Connect to a data destination
To get data out of Decodable for storage and downstream analytics, you must first create a connection to a data destination using a sink connector.
This is step 6 of 6 for using Decodable to process and send data to a destination.
For an overview of all the steps, see Quick start: Process and route data using Decodable.

Every sink connector has its own set of properties, and the exact fields you'll be asked to provide vary depending on the data destination. You can send data from Decodable to the following data destinations:
- Apache Druid
- Apache Hudi
- Apache Pinot
- AWS Kinesis
- AWS Redshift
- AWS S3
- ClickHouse
- Datadog
- DataStax Astra Streaming
- Delta Lake
- Elasticsearch
- Google Cloud Platform Pub/Sub
- Imply Polaris
- InfluxDB
- MongoDB
- Oracle
- Postgres
- Redis
- RedPanda
- Rockset
- Singlestore DB
- Snowflake
- StarTree
- StreamNative
- Timescale DB
Select one of the links above for detailed guidance on connecting to that data destination.
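As a rough illustration of what connector-specific properties look like, the sketch below shows a sink connection for a Postgres destination defined declaratively. The resource layout and every property name here are assumptions for illustration only; each connector's page documents the actual fields it requires.

```yaml
# Hypothetical sink connection definition. The field names below
# (connector, type, properties, etc.) are illustrative placeholders;
# see the specific connector's documentation for its real schema.
kind: connection
metadata:
  name: my-postgres-sink
spec:
  connector: postgres      # which sink connector to use
  type: sink               # data flows out of Decodable
  properties:              # connector-specific fields vary by destination
    hostname: db.example.com
    port: 5432
    database: analytics
```

The general shape is the same for every destination: you pick a sink connector, then fill in the properties that connector requires.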