Oracle
Oracle database services and products offer customers cost-optimized and high-performance versions of Oracle Database, the world's leading converged, multi-model database management system, as well as in-memory, NoSQL, and MySQL databases. Oracle Autonomous Database, available on-premises via Oracle Cloud@Customer or in Oracle Cloud Infrastructure, enables customers to simplify relational database environments and reduce management workloads.
Getting Started
Sending a Decodable data stream to Oracle is accomplished in two stages: first, create a Decodable sink connector that writes to a system Oracle supports as a data source; then, add that system as a data source in your Oracle configuration. Decodable and Oracle mutually support several technologies, including Apache Kafka.
Configure As A Sink
This example demonstrates using Kafka as the sink from Decodable and the source for Oracle. Sign in to the Decodable Web Console and follow the configuration steps provided for the Kafka Connector to create a sink
connector. For examples of using the command line tools or scripting, see the How To guides.
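As a sketch of the command-line route, the Decodable CLI can create the Kafka sink connection. The connection name, stream, topic, and broker address below are placeholders, and flag names may differ between CLI versions, so check `decodable connection create --help` and the How To guides for the authoritative syntax:

```
# Hypothetical example: send a Decodable stream to a Kafka topic that Oracle will read from
decodable connection create \
  --name oracle-kafka-sink \
  --connector kafka \
  --type sink \
  --stream-id <your-stream-id> \
  --prop bootstrap.servers=broker:9092 \
  --prop topic=oracle-ingest \
  --prop value.format=json
```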
Create Kafka Data Source
Oracle Transactional Event Queue (TEQ) makes it easy to implement event-based applications. It also integrates closely with Apache Kafka, the open-source stream-processing platform (written in Scala and Java) originally developed at LinkedIn and donated to the Apache Software Foundation. Beyond letting applications that use the Kafka APIs operate transparently on Oracle TEQ, TEQ also supports bi-directional message flow between TEQ and Kafka, so changes made in either system become available in the other in near real time.
The Kafka Connect sink uses the Java Naming and Directory Interface (JNDI) and the standard JMS interface to create a JMS ConnectionFactory instance for Oracle TEQ, and then enqueues messages to (or dequeues messages from) TEQ accordingly. The prerequisites are as follows:
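A hedged sketch of what that connector configuration can look like, using Confluent's JMS Sink Connector class and Oracle's AQ/TEQ JMS initial context factory. The host, service name, topic, queue name, and credentials are placeholders, and exact property names vary by connector version, so verify them against your connector's documentation:

```
name=teq-sink
connector.class=io.confluent.connect.jms.JmsSinkConnector
tasks.max=1

# Kafka topic to read from (placeholder)
topics=oracle-ingest

# JNDI settings for Oracle TEQ (placeholders)
java.naming.factory.initial=oracle.jms.AQjmsInitialContextFactory
java.naming.provider.url=jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1
java.naming.security.principal=TEQ_USER
java.naming.security.credentials=TEQ_PASSWORD

# Target TEQ destination (placeholder)
jms.destination.type=queue
jms.destination.name=orders_teq
```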
- Kafka Broker: Confluent Platform 3.3.0 or above, or Kafka 0.11.0 or above
- Connect: Confluent Platform 4.1.0 or above, or Kafka 1.1.0 or above
- Java 1.8
- Oracle TEQ JMS 1.1+ Client Jars
Steps for message transfer from Apache Kafka to TEQ are as follows:
1. Start Oracle Database
2. Set up Transactional Event Queue (TEQ)
3. Install the Kafka Connect Sink component
4. Import the TEQ JARs into the Kafka JMS Sink Connector
5. Start Confluent Platform
6. Configure the JMS Sink Connector
7. Load the JMS Sink Connector
8. Check the connector status
9. Test the message transfer
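The load and status-check steps are typically performed through the Kafka Connect REST API (or the Confluent CLI). A minimal sketch, assuming Kafka Connect is listening on `localhost:8083` and the connector is named `teq-sink`:

```
# Load (or update) the connector from a JSON version of its configuration
curl -s -X PUT -H "Content-Type: application/json" \
  --data @teq-sink.json \
  http://localhost:8083/connectors/teq-sink/config

# Post-check: the connector and its task should report state RUNNING
curl -s http://localhost:8083/connectors/teq-sink/status
```

The `PUT /connectors/{name}/config` and `GET /connectors/{name}/status` endpoints are part of the standard Kafka Connect REST API; the port and connector name above are assumptions for this example.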
For more detailed information, please refer to Oracle's Kafka documentation.
Reference
| Property | Value |
| --- | --- |
| Connector name | oracle |
| Type | sink |
Additional Oracle Support
If you are interested in a direct Decodable Connector for Oracle, please contact [email protected] or join our Slack community and let us know!