# Definitions

This is the reference for the apply input and query output YAML format. If you are new to declarative resource management in Decodable, see the overview.

Each resource definition takes the form of a YAML document, which begins with a line of three hyphens (`---`). A YAML file is a series of YAML documents.

New to YAML? This excellent tutorial covers the basics and advanced concepts.

For a complete example with a set of resource definitions that work together, see here.

## Top-level structure

Each resource definition YAML doc has these fields:

```yaml
---
kind: <resource_type>
metadata:
  name: <name_your_resource>
  description: <resource_description>
spec_version: v1
spec:
  <resource_specifications>
```

| Field | Required? | Description |
|---|---|---|
| `kind` | Required | The kind of resource. One of: `secret`, `connection`, `pipeline`, `stream`. |
| `metadata` | Required | Values that identify or describe the resource, including `name` and `description` text strings. The `name` value is mapped to an ID when the YAML is applied. The `description` field is optional. |
| `spec_version` | Required: `v1` | Determines the spec format and processing logic for this kind. Currently always `v1`. |
| `spec` | Required, except with `kind: secret` | The resource's specifications. The format of the `spec` object is different for each Decodable resource kind. It contains nested fields specific to that resource. |

The query output format extends this resource definition structure with a `status` section. The `status` contains non-writable fields managed by Decodable for the resource, and is ignored by apply.

## Resource definitions

A resource definition YAML doc template is given below for each kind, each with a description of the contents of the `spec` field.
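To make the top-level structure concrete, here is a minimal sketch of a stream definition. The resource name, description, field names, and data types are illustrative placeholders, not values from this reference:

```yaml
---
kind: stream                      # one of: secret, connection, pipeline, stream
metadata:
  name: orders                    # mapped to an ID when the YAML is applied
  description: Raw order events   # optional
spec_version: v1                  # currently always v1
spec:                             # contents vary by kind; see the templates below
  schema_v2:
    fields:
      - kind: physical
        name: order_id
        type: BIGINT
      - kind: physical
        name: order_ts
        type: TIMESTAMP(3)
```

Query output for a resource like this would additionally contain the read-only `status` section, which apply ignores.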
### Connection

A template for a connection definition:

```yaml
---
kind: connection
metadata:
  name: <name_your_connection>
  description: <description>
spec_version: v1
spec:
  connector: <connector_name>
  type: <type of connector>
  properties:
    <property_1>: <value>
    <property_2>: <value>
  stream_name: <stream>
  schema_v2:
    fields:
      - kind: physical
        name: <field name>
        type: <data type>
      - kind: computed
        name: <field name>
        expression: <expression>
      - kind: metadata
        name: <field name>
        type: <data type>
        key: <per-connector key>
```

| Field | Description |
|---|---|
| `spec.connector` | The name of the connector. For example: `mysql-cdc` or `s3-v2`. |
| `spec.type` | The type of connector. Enter `source` if your connector receives data or `sink` if your connector sends data. |
| `spec.properties` | Each connector has its own set of properties that you can specify. To determine the property names and their valid values for a specific connector, refer to the corresponding documentation page for that connector. **Important:** Secret properties, such as passwords, use the name of a secret resource, not the actual password or secret plaintext value. |
| `spec.stream_name` | The name of the stream that you want the connection to send or receive data from. |
| `spec.schema_v2` | The schema of the connection. This must be compatible with the schema of the stream that it's connected to. |
| `spec.schema_v2.fields[*].kind` | One of: `physical`, `computed`, `metadata`. For more information on how to use each field kind, see here. |

### Stream

A template for a stream definition:

```yaml
---
kind: stream
metadata:
  name: <name_your_stream>
  description: <description>
spec_version: v1
spec:
  schema_v2:
    fields:
      - kind: physical
        name: <field name>
        type: <data type>
      - kind: computed
        name: <field name>
        expression: <expression>
    constraints:    # optional
      primary_key:
        - <field name>
    watermarks:     # optional
      - name: <name>
        expression: <watermark expression>
```

| Field | Required? | Description |
|---|---|---|
| `spec.schema_v2` | Required | The schema of the stream. This must be compatible with the schema of any connection that it's connected to, and any pipeline that uses it. |
| `spec.schema_v2.fields[*].kind` | Required | One of: `physical`, `computed`. For more information on how to use each field kind, see here. |
| `spec.schema_v2.constraints.primary_key` | Optional | One or more fields to use as a primary key or partition key. You must specify a primary key if, and only if, you are sending Change Data Capture (CDC) records. For more information on partition keys or primary keys, see the table in How to use the schema view. |
| `spec.schema_v2.watermarks` | Optional | A field that can be used to track the progression of event time in a pipeline. |

### Secret

A template for a secret definition:

```yaml
---
kind: secret
metadata:
  name: <name_your_secret>
  description: <description>
spec_version: v1
spec:
  # Use _one_ of the following value_* fields.
  value_env_var: MY_PASSWORD
  value_file: ./secret.txt
  value_literal: "don't-tell-123"
```

A Decodable Secret resource stores a plaintext value (such as a password) securely, so that it may be used by Connections or Custom Pipelines. A Connection references a Secret by name from a (secret-type) property value.

The apply CLI command can set this plaintext value from an environment variable, a file, or a literal value, via special `spec` fields. These fields can only be used to write the plaintext value: the query command output for a Secret will not include them. The fields aren't considered part of the Secret resource; instead they act as directives to the CLI apply command to write the plaintext value to the secure Secret storage in your Decodable account.

| Field | Description |
|---|---|
| `spec.value_env_var` | The name of an environment variable containing the plaintext value. This is recommended for a simple plaintext value string such as a password. Note that not all binary values can be stored in an environment variable, and that some shell-based techniques can strip newlines when setting an environment variable. The apply command will error if no environment variable by this name is set. |
| `spec.value_file` | A path (relative to the YAML file's directory) to a local file containing the plaintext value. This is recommended for a multiline string (such as a private key) or a binary plaintext value. The file contents are used verbatim, including any trailing newlines. The apply command will error if no such file can be found and read. |
| `spec.value_literal` | A string with the literal plaintext value. Be cautious storing any secret value this way: take care that any YAML file using this field isn't committed to source control, emailed, sent over Slack, logged, etc. This field may be appropriate for private ad hoc use, and for use with generated YAML. The other two fields are recommended for most CI/CD use. |

For API users: These fields are specially handled by the CLI apply command; the underlying `/apply` API doesn't directly support them.

#### Secret definition without plaintext value

When no `spec.value_*` field is present, as in query command output, no plaintext value is written for the Secret resource on apply. In this case there is no `spec` for the Secret definition, but `spec_version: v1` is still required.

For a Secret created this way, the plaintext value may be set manually after apply, as a separate step. This can be done via the Decodable UI or CLI (`decodable secret write <id> --value '<value>'`). It need only be done once per Secret. Alternatively, a `spec.value_*` field may be added and apply repeated.

If the Secret resource already exists and no `spec.value_*` field is provided, then the existing plaintext value will remain unchanged. This may be appropriate for certain use cases.

### Pipeline

#### SQL Pipeline

A template for a SQL Pipeline definition:

```yaml
---
kind: pipeline
metadata:
  name: <name_your_pipeline>
  description: <description>
spec_version: v1
spec:
  sql: |
    INSERT INTO stream_out
      SELECT ... FROM stream_in
```
| Field | Required? | Description |
|---|---|---|
| `spec.sql` | Required | A SQL statement in the form: `INSERT INTO <output-stream> SELECT … FROM <input-stream>`. For example: `INSERT INTO stream_out SELECT LOWER(col1) FROM stream_in`. |

#### Custom Pipeline

A template for a Custom Pipeline definition. All referenced files are assumed to be available in the same location as the YAML doc:

```yaml
---
kind: pipeline
metadata:
  name: <name_your_pipeline>
  description: <description>
spec_version: v1
spec:
  type: JAVA
  job_file_path: pipeline.jar
  config_file_paths:
    - config.yaml
    - example.data
  properties:
    secrets:
      - <secret_name>
    flink_version: 1.18-java11
    additional_metrics:
      - some.flink.metric
```

| Field | Required? | Description |
|---|---|---|
| `spec.type` | Required | Job file type. Either `JAVA` or `PYTHON`. |
| `spec.job_file_path` | Either this or `job_file_sha` is required | Path to the job file (a JAR for `JAVA`, a ZIP for `PYTHON`) in the local file system. The CLI will upload this file as needed. Not available if invoking the API directly. |
| `spec.job_file_sha` | Either this or `job_file_path` is required | SHA-512 of the contents of the job file. Must match an already-uploaded job file. |
| `spec.job_arguments` | Optional | Argument string input to the Custom Pipeline program. |
| `spec.entry_class` | Optional | Entry class of the Custom Pipeline. If not provided, the entry class must be specified in the `META-INF/MANIFEST.MF` file in the pipeline's JAR file, using the `Main-Class` property key. |
| `spec.config_file_paths` | Optional | List of strings, each a path to a config file in the local file system. The CLI will upload these files as needed. Not available if invoking the API directly. |
| `spec.config_files` | Optional | List of objects (as follows). |
| `spec.config_files[*].name` | Optional | Name of the config file as exposed to the Custom Pipeline program. |
| `spec.config_files[*].sha` | Required | SHA-512 of the contents of the config file. Must match an already-uploaded config file. |
| `spec.properties` | Required | Object (as follows). |
| `spec.properties.secrets` | Optional | List of strings, each a Secret name (or ID). |
| `spec.properties.flink_version` | Required | A specific Decodable-supported Flink version (string). |
| `spec.properties.additional_metrics` | Optional | List of strings, each a Flink metric to expose in the `_metrics` stream for this Custom Pipeline. |
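As a sketch of how these definitions fit together, the following single YAML file declares a secret, two streams, a source connection, and a SQL pipeline. All names are illustrative, and the connector name and its property names are hypothetical placeholders; refer to the documentation page for your connector for the real property names:

```yaml
---
kind: secret
metadata:
  name: pg_password
spec_version: v1
spec:
  value_env_var: PG_PASSWORD   # plaintext read from this env var on apply
---
kind: stream
metadata:
  name: orders_raw
spec_version: v1
spec:
  schema_v2:
    fields:
      - kind: physical
        name: order_id
        type: BIGINT
      - kind: physical
        name: amount
        type: DECIMAL(10, 2)
---
kind: stream
metadata:
  name: orders_large
spec_version: v1
spec:
  schema_v2:
    fields:
      - kind: physical
        name: order_id
        type: BIGINT
      - kind: physical
        name: amount
        type: DECIMAL(10, 2)
---
kind: connection
metadata:
  name: orders_source
spec_version: v1
spec:
  connector: postgres-cdc      # hypothetical connector name
  type: source
  properties:
    password: pg_password      # the name of the secret above, not a plaintext value
  stream_name: orders_raw      # must be schema-compatible with the stream
  schema_v2:
    fields:
      - kind: physical
        name: order_id
        type: BIGINT
      - kind: physical
        name: amount
        type: DECIMAL(10, 2)
---
kind: pipeline
metadata:
  name: filter_large_orders
spec_version: v1
spec:
  sql: |
    INSERT INTO orders_large
      SELECT order_id, amount
      FROM orders_raw
      WHERE amount > 100
```

A file like this is the input format for the apply command; query output for the same resources would add the read-only `status` section described above, and would omit the secret's `value_env_var` directive.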