Oracle sink connector

Oracle database services and products offer customers cost-optimized, high-performance versions of Oracle Database, a converged, multi-model database management system, as well as in-memory, NoSQL, and MySQL databases. Oracle Autonomous Database, available on premises via Oracle Cloud@Customer or in Oracle Cloud Infrastructure, enables customers to simplify relational database environments and reduce management workloads.

If you are interested in a direct Decodable Connector for Oracle, contact support@decodable.co or join our Slack community and let us know!

Getting started

Sending a Decodable data stream to Oracle is accomplished in two stages: first, create a sink connector to a system that Oracle supports as a data source, and then add that system as a data source in your Oracle configuration. Decodable and Oracle both support several such technologies, including Apache Kafka.

Configure as a sink

This example demonstrates using Kafka as the sink from Decodable and the source for Oracle. Sign in to Decodable Web and follow the configuration steps provided in the Apache Kafka sink connector topic to create a sink connector. For examples of using the command line tools or scripting, see the How To guides.

Create Kafka data source

Oracle Transactional Event Queue (TEQ) makes it easy to implement event-based applications. It is also highly integrated with Apache Kafka, the open-source stream-processing platform written in Scala and Java, originally developed at LinkedIn and donated to the Apache Software Foundation. Besides enabling applications that use Kafka APIs to transparently operate on Oracle TEQ, Oracle TEQ also supports bi-directional information flow between TEQ and Kafka, so that changes made on one side are available on the other in near real time.

The Kafka Connect JMS sink connector uses the Java Naming and Directory Interface (JNDI) and the standard JMS interfaces to create a JMS ConnectionFactory instance for Oracle TEQ, and then enqueues or dequeues messages to and from TEQ accordingly. The prerequisites are as follows:

  • Kafka Broker: Confluent Platform 3.3.0 or above, or Kafka 0.11.0 or above

  • Connect: Confluent Platform 4.1.0 or above, or Kafka 1.1.0 or above

  • Java 1.8

  • Oracle TEQ JMS 1.1+ Client Jars
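As a sketch of what this JNDI/JMS wiring looks like in practice, the following properties file shows illustrative settings for a Kafka Connect JMS sink connector pointed at an Oracle TEQ queue. The property names follow Confluent's JMS Sink Connector; the connector name, credentials, database URL, topic, and queue name are hypothetical placeholders, and the Oracle-specific values (such as the JNDI initial context factory class) should be verified against Oracle's TEQ JMS client documentation.

```properties
# Illustrative only -- adapt all names and values to your environment.
name=teq-jms-sink
connector.class=io.confluent.connect.jms.JmsSinkConnector
tasks.max=1

# Kafka topic(s) to drain into TEQ
topics=orders

# JNDI settings used to look up the JMS ConnectionFactory for Oracle TEQ
# (the initial context factory class is provided by the Oracle TEQ JMS client jars)
java.naming.factory.initial=oracle.jms.AQjmsInitialContextFactory
java.naming.security.principal=TEQ_USER
java.naming.security.credentials=TEQ_PASSWORD
db_url=jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1
jndi.connection.factory=ConnectionFactory

# Destination queue in the Oracle database
jms.destination.type=queue
jms.destination.name=ORDERS_TEQ

confluent.topic.bootstrap.servers=localhost:9092
```

With this configuration, the connector resolves the ConnectionFactory through JNDI and enqueues each Kafka record on the named TEQ queue as a JMS message.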

The steps for transferring messages from Apache Kafka to TEQ are as follows:

  1. Start Oracle Database

  2. Set up a Transactional Event Queue (TEQ)

  3. Install Kafka Connect Sink Component

  4. Import TEQ Jars into Kafka JMS Sink Connector

  5. Start Confluent Platform

  6. Configure JMS Sink Connector

  7. Load the JMS Sink Connector

  8. Post-Check Connector Status

  9. Test Message Transfer
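Steps 7 and 8 can be sketched against the Kafka Connect REST API. The commands below assume a Connect worker listening on localhost:8083 and a connector configuration saved as JSON in a file named teq-jms-sink.json; both the connector name and the file name are hypothetical.

```shell
# Step 7: load the JMS sink connector through the Kafka Connect REST API.
curl -s -X POST -H "Content-Type: application/json" \
  --data @teq-jms-sink.json http://localhost:8083/connectors

# Step 8: post-check -- confirm the connector and its task report RUNNING.
curl -s http://localhost:8083/connectors/teq-jms-sink/status
```

Once the status check reports RUNNING, you can test message transfer by producing records to the configured Kafka topic and dequeuing them from the TEQ queue in the database.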

For more detailed information, see Oracle’s Kafka documentation.