Imply Polaris

Imply Polaris® is a real-time database for modern analytics applications, built from Apache Druid® and delivered as a fully managed database as a service (DBaaS). It provides a complete, integrated experience that simplifies everything from ingesting data to using visualizations to extracting valuable insights from your data.

Imply Polaris vs. Druid

Imply Polaris can be used as an alternative to a self-managed Druid deployment, and Decodable's Imply Polaris connector is likewise the alternative to the Druid sink connector. Unlike the Druid connector, no Kafka deployment is necessary to work with Polaris.

Key benefits of Polaris include:

  • A fully managed cloud service. You do not have to configure and run your own Kafka data sources to ingest data to Polaris (as you would need to with Druid). Just point, click, and stream.
  • A single development experience, with push-based streaming built on Confluent Cloud.
  • Database optimization.
  • Scale in seconds.
  • Resiliency and security.

Configure As A Sink

Imply Polaris is a sink connection, meaning that data can only be written from Decodable to Polaris. Sending a Decodable data stream to Imply Polaris can be accomplished in four main steps:

  1. Create a table in Polaris.
  2. Create a Decodable stream to ingest to the Polaris table created in step 1.
  3. Create a push_streaming connection and a streaming job to your Polaris table.
    1. NOTE 1: connections in Polaris are distinct from connections in Decodable (see below for documentation on Polaris connections).
    2. NOTE 2: ensure that the input schema you specify in the streaming job matches the schema of the Decodable stream you are sending to Polaris; otherwise, your data may not appear in your Polaris table.
  4. Create an Imply Polaris connection in Decodable by specifying:
    1. the Decodable stream you would like to send to Polaris.
    2. the Polaris connection name you created.
    3. your Polaris organization name.
    4. your Polaris API client ID.
    5. the secret associated with your API client ID.

Start the connection and observe data flow into your Polaris table.
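The schema-match requirement in step 3 can be checked mechanically before you create the streaming job. A minimal sketch (the two schema lists below are hypothetical; in practice you would fetch them from each service's API):

```python
# Sanity-check that a Decodable stream schema matches a Polaris job's
# inputSchema by comparing ordered field names. Both lists here are
# invented for illustration.
decodable_stream_schema = [
    {"name": "timestamp", "type": "string"},
    {"name": "method", "type": "string"},
]
polaris_input_schema = [
    {"name": "timestamp", "dataType": "string"},
    {"name": "method", "dataType": "string"},
]

def schemas_match(stream, input_schema):
    """True when both schemas list the same field names in the same order."""
    return [f["name"] for f in stream] == [f["name"] for f in input_schema]

assert schemas_match(decodable_stream_schema, polaris_input_schema)
```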

For a more detailed walkthrough, see the example below.

Example

Let's walk through an example of the four steps above. We will use the Polaris REST API to illustrate connection and job creation in Polaris, and the Decodable REST API to create the sink connection to Imply Polaris. (NOTE: this workflow can also be completed via the UI or CLI of both services.)

  1. For simplicity's sake, let's say we've already created a Polaris table called http_requests with the following schema:

__time | method | original_path | response_code | bytes_rcvd

  2. Let's also say we've already created a Decodable stream called http_requests_stream with the following schema:

timestamp | method | original_path | response_code | bytes_rcvd

Note the field name difference between __time in the Polaris table and timestamp in the Decodable stream. This will be reconciled in step 3.

  3. Using the Polaris REST API, create a push_streaming connection and streaming job as follows:
curl --location --request POST 'https://ORGANIZATION_NAME.api.imply.io/v2/connections' \
--header "Authorization: Bearer $IMPLY_TOKEN" \
--header 'Content-Type: application/json' \
--data-raw '{
  "type": "push_streaming",
  "name": "demo_connection"
}'

curl --location --request POST 'https://ORGANIZATION_NAME.api.imply.io/v2/jobs' \
--header "Authorization: Bearer $IMPLY_TOKEN" \
--header 'Content-Type: application/json' \
--data-raw '{
    "type": "streaming",
    "target": {
        "type": "table",
        "tableName": "http_requests"
    },
    "source": {
        "type": "connection",
        "connectionName": "demo_connection",
        "inputSchema": [
            {
                "name": "timestamp",
                "dataType": "string"
            },
            {
                "name": "method",
                "dataType": "string"
            },
            {
                "name": "original_path",
                "dataType": "string"
            },
            {
                "name": "response_code",
                "dataType": "long"
            },
            {
                "name": "bytes_rcvd",
                "dataType": "long"
            }
        ],
        "formatSettings": {
            "format": "nd-json"
        }
    },
    "mappings": [
        {
            "columnName": "__time",
            "expression": "TIME_PARSE(\"timestamp\")"
        },
        {
            "columnName": "method",
            "expression": "\"method\""
        },
        {
            "columnName": "original_path",
            "expression": "\"original_path\""
        },
        {
            "columnName": "response_code",
            "expression": "\"response_code\""
        },
        {
            "columnName": "bytes_rcvd",
            "expression": "\"bytes_rcvd\""
        }
    ]
}'
  4. Using the Decodable REST API, create the sink connection to Imply Polaris, specifying all the fields required to connect to your Polaris project.
curl --request POST \
     --url https://<your_decodable_organization>.api.decodable.co/v1alpha2/connections \
     --header 'Accept: application/json' \
     --header "Authorization: Bearer $DECODABLE_AUTH_TOKEN" \
     --header 'Content-Type: application/json' \
     --data '
{
     "connector": "polaris",
     "properties": {
          "polaris.client-id": "<your_client_ID>",
          "polaris.connection-name": "<your_connection_name>",
          "polaris.org-name": "<your_org_name>",
          "polaris.client-secret": "<your_client_secret>"
     },
     "name": "imply_polaris_demo",
     "description": "Send data to Imply Polaris from Decodable",
     "type": "sink",
     "stream_id": "<http_requests_stream ID>"
}
'

Start the imply_polaris_demo connection from the REST API/UI/CLI and sit back, pet your cat, take a walk, or start working with your data in Polaris!
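With everything wired up, each record on http_requests_stream reaches Polaris as one line of NDJSON matching the job's inputSchema. A minimal sketch of what such a record looks like (the field values are invented for illustration):

```python
import json

# A sample event matching the inputSchema of the streaming job above.
event = {
    "timestamp": "2024-01-15T09:30:00Z",  # mapped to __time via TIME_PARSE
    "method": "GET",
    "original_path": "/index.html",
    "response_code": 200,
    "bytes_rcvd": 5120,
}

# Polaris's nd-json format expects exactly one JSON object per line.
ndjson_line = json.dumps(event)
print(ndjson_line)
```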

Things to Note

  • An event timestamp field must exist in your Decodable stream schema in order for data to be persisted in your Polaris table.
  • If you are using Decodable to send data to a Polaris v1 table, use the Table ID wherever Decodable asks you for the Connection Name.
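The first point can be checked on the producing side before records ever reach the stream. A minimal sketch, assuming ISO-8601 timestamps in a field named timestamp (as in the example above; the helper name is hypothetical):

```python
from datetime import datetime

def has_event_timestamp(record, field="timestamp"):
    """Return True if the record carries a parseable ISO-8601 event timestamp."""
    try:
        # Python's fromisoformat predates 3.11 support for a trailing "Z",
        # so normalize it to an explicit UTC offset first.
        datetime.fromisoformat(record[field].replace("Z", "+00:00"))
        return True
    except (KeyError, ValueError, AttributeError):
        return False

assert has_event_timestamp({"timestamp": "2024-01-15T09:30:00Z"})
assert not has_event_timestamp({"method": "GET"})
```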

Reference

Connector name     | polaris
Type               | sink
Delivery guarantee | at least once

Apache Kafka, Kafka®, Apache®, Druid®, and associated open source project names are either registered trademarks or trademarks of The Apache Software Foundation.
