Web Quickstart Guide

This guide steps you through an end-to-end example of how to use Decodable to parse and structure Envoy logs in real time. It should give you enough exposure to the platform that you can get rolling with a wide array of use cases in short order. We’ll assume that you’ve already created your account and logged in successfully. For your convenience, we've created some resources in your account to get you started.

In this guide, we’ll use the Decodable web app. If you’re interested in using the command line interface instead, see the CLI Quickstart Guide.


Connections are reusable links to your data infrastructure. They pipe data from an external system into a Decodable stream, or vice versa.

First, let's click the Connections tab on the top bar to check out the connections in your account. One should already be pre-created for you:


The connection we see here uses the 'datagen' connector, which generates test data. Its type is 'source', meaning this connector feeds new data from an external system into a stream. Let's activate the connection to get some data flowing. Click on the Start button inside the '…' menu to the right of the connection:


You’ll soon see the Actual State of the connection transition to 'Running'. At this point, the connection is writing test data to a stream called envoy_raw, which has been pre-created for you.

Each record has a single field named value with a type of STRING. The content emulates raw HTTP event logs in JSON format, as in the examples below:

{"value":"[2021-11-05T04:48:03Z] \"GET /products/3 HTTP/1.1\" 500 URX 2001 6345 82 32 \"-\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64)\" \"5c70092a-ed05-4d0c-9d2b-f7bf361ad17e\" \"localhost\" \"\""}
{"value":"[2021-11-05T04:48:03Z] \"DELETE /users/1 HTTP/2.0\" 200 NC 4044 3860 41 39 \"-\" \"Mozilla/5.0 (Linux; Android 10) \" \"5ca9fd79-afee-44db-9352-2ee9949dc6df\" \"aws.gateway\" \"\""}
{"value":"[2021-11-05T04:48:03Z] \"DELETE /products/2 HTTP/2.0\" 500 UH 3826 8831 14 33 \"-\" \"Mozilla/5.0 (Linux; Android 10) \" \"5f0ae73d-c76b-471f-9458-3efc45128509\" \"aws.gateway\" \"\""}
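To make the record shape concrete: each record is a JSON object with a single STRING field named value holding one raw log line. Here's a quick, illustrative Python check using the first sample above (not part of the Decodable workflow itself):

```python
import json

# One of the sample records above, exactly as it appears on the stream.
raw = ('{"value":"[2021-11-05T04:48:03Z] \\"GET /products/3 HTTP/1.1\\" '
       '500 URX 2001 6345 82 32 \\"-\\" '
       '\\"Mozilla/5.0 (Windows NT 10.0; Win64; x64)\\" '
       '\\"5c70092a-ed05-4d0c-9d2b-f7bf361ad17e\\" \\"localhost\\" \\"\\""}')

record = json.loads(raw)
print(list(record.keys()))    # ['value']
print(type(record["value"]))  # <class 'str'>
```

Everything interesting is packed into that one string, which is why the pipeline we build later needs to parse it into structured fields.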

Now that we have some data flowing, let’s see what we can do with it. Click on the datagen_envoy_connection:


Click the Outbound to 1 stream - envoy_raw button to view the stream the connection writes to:



You will now see the envoy_raw stream overview with a preview of the data. Above the sample data, the button on the left shows that the inbound data comes from the datagen_envoy_connection. The one on the right shows that this stream currently has no output. Click on Create a pipeline inside the '…' menu on the right of the output button:



Now we’re ready to build our pipeline.

Here, you’ll be presented with a window with three panes. The upper right pane is a text box where you can author your pipeline SQL. A simple INSERT query has been pre-constructed for you based on the input stream.

The lower right pane is Decodable’s preview feature, which gives you real-time visibility into the contents of your streams and lets you quickly iterate on pipeline designs. Click Run Preview to preview your new pipeline. Your results should appear in under a minute:


You should see JSON records with a single field named 'value'. Its values are plaintext log entries of API activity in Envoy’s logging format:


In order to make use of these logs, we’ll first need to parse them. Try copying the following SQL statement into the pipeline SQL pane and clicking Run Preview again:

-- Extract Envoy fields from a map as top level fields and insert them into the
-- http_events stream.
INSERT INTO http_events
  SELECT
    CAST(envoy['timestamp'] AS STRING) AS `timestamp`,
    CAST(envoy['method']    AS STRING) AS `method`
  FROM (
    -- Match and parse Envoy records in the value field of the envoy_raw stream.
    -- grok() produces a map<field name, value> we call envoy.
    SELECT grok(
        `value`,
        '\[%{TIMESTAMP_ISO8601:timestamp}\] "%{DATA:method}"'
      ) AS envoy
    FROM envoy_raw
  )
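If you're curious what the grok pattern is matching, here's a rough Python equivalent. This is an illustration only: the regex below approximates the TIMESTAMP_ISO8601 and DATA grok patterns and may not match Decodable's exact pattern definitions.

```python
import re

# Approximate Python equivalents of the grok patterns used above:
#   TIMESTAMP_ISO8601 -> an ISO-8601 timestamp
#   DATA              -> a lazy ".*?" match
ENVOY_LINE = re.compile(
    r'\[(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z?)\] '
    r'"(?P<method>.*?)"'
)

line = ('[2021-11-05T04:48:03Z] "GET /products/3 HTTP/1.1" 500 URX 2001 6345 '
        '82 32 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" '
        '"5c70092a-ed05-4d0c-9d2b-f7bf361ad17e" "localhost" ""')

envoy = ENVOY_LINE.search(line).groupdict()
print(envoy["timestamp"])  # 2021-11-05T04:48:03Z
print(envoy["method"])     # GET /products/3 HTTP/1.1
```

Like grok(), the match produces a dictionary of named captures, which the SQL statement then casts and renames into top-level columns.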

Now we see JSON with neatly parsed fields with sensible names and types. Click Next:


The name of the output stream and the names and types of the fields in the stream’s schema are based on the earlier SQL statement. Give the stream a description of your choosing, like 'parsed Envoy logs', and click Create Stream and then Next:


Give this new pipeline a name and description of your choosing.

Go ahead and click Create Pipeline:


You’ll be brought to the pipeline overview page, showing the pipeline in a Stopped state. Go ahead and click Start:


The pipeline’s state should immediately change to Starting. It will settle into the Running state in a minute or two once infrastructure has been provisioned and data is flowing. You’ll see the pipeline’s metrics start to roll in as well:


Congratulations! You’re now ready to build and run your own pipelines using our test data or configure connections to your own data infrastructure to get started in earnest. Thanks for test-driving Decodable!

What’s Next