Running Camel Kafka Connector for IBM Event Streams

Zhimin Wen
Aug 27

Camel Kafka Connectors, packaged as standard Kafka connectors, greatly extend Kafka's integration capabilities. There are about 164 connectors available at the time of writing.

Let’s explore using these connectors with IBM Event Streams.

Build the Connector Image

Let's build the connector image manually, both to reduce the magic involved and to keep the build environment separate from the runtime environment.

Download the Camel Kafka Connectors; here we use the file connector and the HTTP source connector:

curl -LO
curl -LO

Create a directory and untar the files into it. Notice that each connector gets its own directory from the tar file structure.

```shell
mkdir -p my-plugins
cd my-plugins
tar xf ../camel-file-kafka-connector-3.18.2-package.tar.gz
tar xf ../camel-http-source-kafka-connector-3.18.2-package.tar.gz
```

Log in to the IBM entitled registry with your entitlement key, then build the image with the following Dockerfile. The `FROM` line below is a placeholder; use the Event Streams Kafka image that matches your installation.

```dockerfile
FROM <event-streams-kafka-image>
COPY ./my-plugins/ /opt/kafka/plugins/
USER 1001
```

Push the image into the OCP internal registry, under the target namespace where the connector will run, say k-connector.
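As a sketch, the build and push could look like the following. The registry route, namespace, and image name are assumptions; adapt them to your cluster.

```shell
# Build the image from the Dockerfile above (image name is illustrative)
podman build -t camel-connect:latest .

# Assumed default route of the OCP internal image registry
REGISTRY=default-route-openshift-image-registry.apps.example.com

# Log in with the current OpenShift user token, then tag and push
# into the k-connector namespace
podman login -u "$(oc whoami)" -p "$(oc whoami -t)" "$REGISTRY"
podman tag camel-connect:latest "$REGISTRY/k-connector/camel-connect:latest"
podman push "$REGISTRY/k-connector/camel-connect:latest"
```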

Create the Connect Cluster

Create the KafkaConnect custom resource with the following YAML. Other spec fields, such as the bootstrap servers and the `image` pointing to the image pushed earlier, are omitted here for brevity.

```yaml
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster
  namespace: k-connector
  annotations:
    eventstreams.ibm.com/use-connector-resources: "true"
spec:
  template:
    pod:
      imagePullSecrets: []
      metadata:
        annotations:
          eventstreams.production.type: CloudPakForIntegrationNonProduction
          productID: 2a79e49111f44ec3acd89608e56138f5
          productName: IBM Event Streams for Non Production
          productVersion: 11.2.3
```
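Once the Connect cluster is running, a connector can be defined declaratively with a KafkaConnector resource. A minimal sketch using the file sink connector from the image we built; the connector class, topic name, and file properties are assumptions to adjust for your setup:

```yaml
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: file-sink-connector
  namespace: k-connector
  labels:
    # Must match the name of the KafkaConnect cluster above
    eventstreams.ibm.com/cluster: my-connect-cluster
spec:
  # Connector class shipped in camel-file-kafka-connector (assumed name)
  class: org.apache.camel.kafkaconnector.file.CamelFileSinkConnector
  tasksMax: 1
  config:
    topics: my-topic                          # assumed topic name
    camel.sink.path.directoryName: /tmp/sink  # assumed target directory
    camel.sink.endpoint.fileName: out.txt     # assumed target file
```

The `eventstreams.ibm.com/use-connector-resources: "true"` annotation on the KafkaConnect resource is what enables the operator to pick up KafkaConnector resources like this one instead of managing connectors over the REST API.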