Web Quickstart Guide

This guide steps you through an end-to-end example of how to use Decodable to parse and structure Envoy logs in real-time. It should give you enough exposure to the platform that you can get rolling with a wide array of use cases in short order. We’ll assume that you’ve already created your account and logged in successfully. For your convenience, we've created some resources in your account to get you started.

In this guide, we’ll use the Decodable web app. If you’re interested in using the command line interface instead, see the CLI Quickstart Guide.

Connections

Connections are reusable links to your data infrastructure. They pipe data from an external system into a Decodable stream, or from a stream out to an external system.

First, let's click the Connections tab on the top bar to check out the connections in your account. One has already been created for you:

The connection we see here uses the 'datagen' connector, which generates test data. Its type is 'source', meaning this connector feeds new data from an external system into a stream. Let's activate the connection to get some data flowing. Click on the Start button inside the '…' menu to the right of the connection:

You’ll soon see the Actual State of the connection transition to 'Running'. At this point, test data is being written to a stream called envoy_raw, which has been pre-created for you.

The data will have a single field named value with a type of STRING. Its content emulates raw Envoy HTTP event logs, wrapped in JSON records as in the examples below.

{"value":"[2021-11-05T04:48:03Z] \"GET /products/3 HTTP/1.1\" 500 URX 2001 6345 82 32 \"-\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64)\" \"5c70092a-ed05-4d0c-9d2b-f7bf361ad17e\" \"localhost\" \"192.168.0.11:443\""}
{"value":"[2021-11-05T04:48:03Z] \"DELETE /users/1 HTTP/2.0\" 200 NC 4044 3860 41 39 \"-\" \"Mozilla/5.0 (Linux; Android 10) \" \"5ca9fd79-afee-44db-9352-2ee9949dc6df\" \"aws.gateway\" \"10.0.0.1\""}
{"value":"[2021-11-05T04:48:03Z] \"DELETE /products/2 HTTP/2.0\" 500 UH 3826 8831 14 33 \"-\" \"Mozilla/5.0 (Linux; Android 10) \" \"5f0ae73d-c76b-471f-9458-3efc45128509\" \"aws.gateway\" \"10.0.0.1\""}

Streams

Now that we have some data flowing, let’s see what we can do with it. Before we can create a pipeline that reads this data, we need to create a stream that the pipeline can output to. Click the Streams tab on the top bar and then click the New Stream button on the right side:

Here we specify the names and types of the fields in the stream’s schema. Let’s define a field named 'timestamp' with type STRING and click Add Field, then name the new field 'method', also with type STRING, and click Next:

Name your stream 'http_events', give it a description of your choosing like 'parsed Envoy logs', and click Create Stream:
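
At this point the http_events stream exists with a two-field schema. If it helps to think of it in SQL terms, the equivalent definition would look roughly like the sketch below (illustrative only; the form above is what actually defines the stream):

-- Illustrative only: the http_events schema expressed as SQL-style DDL.
-- Streams are created via the web app form, not by running this statement.
CREATE TABLE http_events (
  `timestamp` STRING,
  `method`    STRING
)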

Pipelines

Now that we have a target stream for the parsed data, we’re ready to build our pipeline. Click the Pipelines tab along the top navigation bar and then click Create Your First Pipeline:

Give this new pipeline a name and description of your choosing and then click Next:

For the input stream, select the envoy_raw stream containing our generated test data. Click Next:

As the output stream, select the http_events stream that we just created. Go ahead and click Next:

Here, you’ll be presented with a text box where you can author your pipeline SQL. A simple INSERT query has been pre-constructed for you based on the input and output streams you selected.
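
The exact query depends on the streams you selected; as a rough sketch, the generated starting point is a simple pass-through along these lines:

-- Pass-through sketch: read every record from the input stream
-- and write it to the output stream unchanged.
INSERT INTO http_events
SELECT * FROM envoy_raw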

This screen also gives you access to Decodable’s preview feature, which provides real-time visibility into the contents of your streams and lets you quickly iterate on pipeline designs. Click Run Preview to preview your new pipeline. Your results should appear in under a minute:

You should see JSON records with a single field named 'value'. Its values are plaintext log entries of API activity in Envoy’s logging format. To make use of these logs, we’ll first need to parse them. Try pasting in this SQL instead and click Run Preview again:

-- Extract Envoy fields from a map as top level fields and insert them into the
-- http_events stream.
INSERT INTO http_events
SELECT
  CAST(envoy['timestamp'] AS STRING) AS `timestamp`,
  CAST(envoy['method']    AS STRING) AS `method`
FROM (
  -- Match and parse Envoy records in the value field of the envoy_raw stream.
  -- grok() produces a map<field name, value> we call envoy.
  SELECT
    grok(
      `value`,
      '\[%{TIMESTAMP_ISO8601:timestamp}\] "%{DATA:method}"'
    ) AS envoy
  FROM envoy_raw
)

Now we see JSON records with neatly parsed fields that have sensible names and types.
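
For example, the first raw record shown earlier would come out looking something like this (exact values depend on the generated data and on what the grok pattern above captures):

{"timestamp":"2021-11-05T04:48:03Z","method":"GET /products/3 HTTP/1.1"}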

Go ahead and click Create Pipeline. You’ll be brought to the pipeline details page, which shows the pipeline in the Stopped state. Click Start in the upper-right corner:

The pipeline’s state should immediately change to Starting. It will settle into the Running state in a minute or two once infrastructure has been provisioned and data is flowing. You’ll see the pipeline’s metrics start to roll in as well:

Congratulations! You’re now ready to build and run your own pipelines using our test data, or to configure connections to your own data infrastructure and get started in earnest. Thanks for test driving Decodable!

