Manage Schemas
A schema defines the structure of the records flowing through Decodable. It specifies the fields that the records contain and the data types of those fields. Decodable uses schemas to validate that the data being ingested adheres to the expected format. This means that the schema of a stream must match the schema of the connection that it is attached to.
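For example, a stream carrying records like the following would need a schema that declares each of these fields with a compatible data type. The field names here are hypothetical, for illustration only:

```json
{
  "user_id": 42,
  "email": "jane@example.com",
  "created_at": "2023-01-15T09:30:00.000Z"
}
```

A matching schema might declare `user_id` as `BIGINT`, `email` as `STRING`, and `created_at` as `TIMESTAMP(3)`. Any connection attached to that stream must produce or consume records with those same field names and types.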
You can view the schema of a connection or stream by doing the following.
- Navigate to the Streams or Connections page.
- Select the stream or connection that you want to view the schema for. The Overview page for that resource opens.
- Select the Schema tab to view the schema for the resource.
How to use the Schema view
Use the Schema page to define the schema of your data. Schemas help ensure that the data flowing through Decodable stays consistent and accurate even as it is processed and handled by different systems and applications.
Performing schema migrations or editing a schema that is currently being used by an active connection
If you want to perform a schema migration or make any changes to a schema that is attached to an actively running connection, review the Schema Migrations topic before continuing.
The following screen image and accompanying table describe the various components found on the Schema page.

Number | Name | Description |
---|---|---|
1 | Partition Key Fields / Primary Key Fields | One or more fields to use as either the partition key or the primary key. |
2 | Name | The field name. |
3 | Type | The field type. For a list of supported data types, see Decodable Data Types. |
4 | Partition Key / Primary Key | Select this icon to use this field as either the partition key or the primary key. To use a field as a primary key, you must also explicitly declare its type as not null by entering `<type> NOT NULL`. For example: `BIGINT NOT NULL`. |
5 | Watermark | A field used to track the progression of event time in a pipeline. You can also include an optional interval to allow for late-arriving data. The field must be one of the following types: `TIMESTAMP(3)`, `TIMESTAMP(2)`, `TIMESTAMP(1)`, `TIMESTAMP_LTZ(3)`, `TIMESTAMP_LTZ(2)`, or `TIMESTAMP_LTZ(1)`. For example: `timestamp_field AS timestamp_field`, `timestamp_field AS timestamp_field - INTERVAL '0.001' SECOND`, or `timestamp_field AS timestamp_field - INTERVAL '5' MINUTE`. |
6 | Data Type Options | One of the following: |
7 | Delete | Removes the field from the stream schema. |
8 | Import Schema | Infers the schema from a sample of data. Select this button to upload a sample of your JSON- or Avro-formatted data and have the schema inferred from it. |
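Putting the primary key and watermark conventions above together, a stream's field list might look like the following sketch. The field names (`order_id`, `order_ts`, and so on) are hypothetical; the type names and watermark expression follow the forms shown in the table:

```sql
-- Hypothetical field list for a stream schema.
-- The primary key field must be declared NOT NULL explicitly.
order_id   BIGINT NOT NULL,   -- primary key field
customer   STRING,
amount     DECIMAL(10, 2),
order_ts   TIMESTAMP(3),      -- event-time field, eligible for a watermark
-- Watermark expression allowing 5 minutes of late-arriving data:
-- order_ts AS order_ts - INTERVAL '5' MINUTE
```

The watermark's interval is a trade-off: a larger interval tolerates later-arriving records at the cost of delaying results that depend on event time.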