# Generate connection definitions

The `connection scan` CLI command creates resource definitions for connectors that support reading from or writing to multiple resources. The generated definitions can then be used to create a connection with the `apply` command.

For a source connection, `connection scan` fetches a list of objects in the source system that match an optional pattern. It outputs a complete connection resource definition with the connection properties that were passed to `connection scan`, along with the objects from the source system mapped to Decodable streams and their associated resource definitions.

For a sink connection, `connection scan` outputs a complete connection resource definition with the connection properties that were passed to `connection scan`, along with the streams that match an optional pattern mapped to objects in the target system.

`connection scan` uses the connection details passed to it in the `prop` values to connect to the external system. Any sensitive values must be passed as secrets. A secret must already exist before it is used with `connection scan`, and it can be referenced by either name or ID.

If an upstream resource changes, you can use `connection scan` to obtain an up-to-date specification, then stop the connection and use `apply` to update it.

## Supported Connectors

`connection scan` is only available for the following connectors:

- Microsoft SQL Server (source & sink)
- MySQL (source & sink)
- Oracle (source & sink)
- Postgres (source & sink)
- Snowflake (sink)

## Usage

The output of `connection scan` is a YAML file.
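Because sensitive values must be supplied as secrets, a typical invocation references a pre-existing secret rather than a plaintext password. The sketch below is illustrative only: the secret name `mysql-etl-password` is made up, and which connection property accepts a secret reference depends on the connector, so check the connector's documentation page.

```shell
# Illustrative sketch, not a verbatim recipe: assumes a secret named
# "mysql-etl-password" already exists in your Decodable account, and that
# the connector's password property accepts a secret reference by name.
decodable connection scan \
  --name petstore \
  --connector mysql-cdc \
  --type source \
  --prop hostname="db.acme-corp.com" \
  --prop username=etl \
  --prop password=mysql-etl-password \
  > petstore-connection.yaml
```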
Here is an example that writes a connection resource definition to a file, including all tables in the `petstore` schema:

```shell
decodable connection scan \
  --name petstore \
  --connector mysql-cdc \
  --type source \
  --prop hostname="db.acme-corp.com" \
  --prop username=etl \
  --prop password="db-etl-pw" \
  --include-pattern schema-name='^petstore$' \
  > petstore-connection.yaml
```

From here you can review the contents of `petstore-connection.yaml` and customize it as needed before using `apply` to create the connection itself. You can also pipe the output directly into `apply` so that the connection is created straight away:

```shell
decodable connection scan \
  --name petstore \
  --connector mysql-cdc \
  --type source \
  --prop hostname="db.acme-corp.com" \
  --prop username=etl \
  --prop password="db-etl-pw" \
  --include-pattern schema-name='^petstore$' \
  | decodable apply -
```

## Syntax

### Connection resource definition parameters

**`name`** (required): The name of the connection to define.

**`description`** (optional): The description of the connection.

**`connector`** (required): The name of the source or sink connector. See the relevant connector page for the value to use here. Possible values include `mysql-cdc`, `postgres-cdc`, and `snowflake`, among others.

**`type`** (required): The type of the connection: `source` or `sink`.

**`prop`** (optional): A series of connection properties, in the format `<connector property key>=<value>`. Consult the documentation for the connector that you are using for details of the available properties. For example:

```shell
--prop hostname=db.acme-corp.com --prop username=my_user
```

### Scan parameters

**`include-pattern`** (optional): Regular expression by which to filter input resources, in the format `<resource specifier key>=<value>`. Multiple patterns can be specified. For a source connection, refer to the connector's documentation page for the specific resource specifier keys available for filtering.
The following would match any table name that ends with `test` and is in a database whose name begins with `poc-` followed by one or more numerical digits:

```shell
--include-pattern database-name='^poc-[0-9]+$' \
--include-pattern table-name='test$'
```

For a sink connection you can only filter by `stream-name`. For example, to include in a sink connection only objects from streams that begin with `warehouse-`, use:

```shell
--include-pattern stream-name='^warehouse-'
```

**`output-resource-name-template`** (optional): Template to use for determining output resource names, in the format `<resource specifier key>=<value>`.

For a source connection the output resource is a Decodable stream, and the resource specifier key is `stream-name`. The available template variables are the resource specifier keys listed on the connector's documentation page. The following would write tables from a MySQL source connection to Decodable streams named with the prefix `epos-` followed by the schema and table names:

```shell
--output-resource-name-template stream-name="epos-{schema-name}-{table-name}"
```

For a sink connection the output resource is an object in the target system; refer to the connector's documentation page for the specific resource specifier keys available. The only available template variable is `stream-name`. For example, to write Decodable streams to a set of SQL Server tables matching the stream name but prefixed with `test-`, use:

```shell
--output-resource-name-template table-name="test-{stream-name}"
```

**`opt`** (optional): Options for the external resource scan, in the format `<option name>=<value>`. Currently the only supported option is `with-metadata-fields=true`, which adds metadata fields to the stream definition. See the connector's documentation page for metadata field details. This option is currently only applicable to the Microsoft SQL Server and MySQL source connectors.
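To make the filtering and naming behavior above concrete, here is a small illustrative sketch in plain bash (not the Decodable CLI itself; the database and table names are made up). It checks some hypothetical database names against the example include pattern, and expands the example name template for one table, which is conceptually what the scan does per matched object:

```shell
# Which hypothetical database names match the example pattern ^poc-[0-9]+$ ?
for db in poc-1 poc-42 poc- poc-extra production; do
  echo "$db" | grep -Eq '^poc-[0-9]+$' && echo "$db: included"
done
# Prints:
#   poc-1: included
#   poc-42: included

# Expanding the template epos-{schema-name}-{table-name} for one made-up
# table, using bash pattern substitution:
schema="petstore"
table="orders"
template="epos-{schema-name}-{table-name}"
name=${template/\{schema-name\}/$schema}
name=${name/\{table-name\}/$table}
echo "$name"   # epos-petstore-orders
```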