Running Camel Kafka Connector for IBM Event Streams

Camel Kafka Connectors run as standard Kafka connectors and greatly increase Kafka's integration capabilities. About 164 connectors are available at the time of writing.
Let’s explore using these connectors with IBM Event Streams.
Build the Connector Image
Let’s build the connector image manually, to keep the magic to a minimum and to separate the build environment from the runtime environment.
Download the Camel Kafka Connectors; here, the file connector and the HTTP source connector:
curl -LO https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-file-kafka-connector/3.18.2/camel-file-kafka-connector-3.18.2-package.tar.gz
curl -LO https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-http-source-kafka-connector/3.18.2/camel-http-source-kafka-connector-3.18.2-package.tar.gz
Create a directory and untar the packages into it. Note that each connector gets its own directory from the tar file structure.
mkdir -p my-plugins
cd my-plugins
tar xf ../camel-file-kafka-connector-3.18.2-package.tar.gz
tar xf ../camel-http-source-kafka-connector-3.18.2-package.tar.gz
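After extraction, the plugin directory should look roughly like the following (the directory names follow the tarball names; the exact list of jars depends on the connector):
my-plugins/
  camel-file-kafka-connector/
    camel-file-kafka-connector-3.18.2.jar
    ... (dependency jars)
  camel-http-source-kafka-connector/
    camel-http-source-kafka-connector-3.18.2.jar
    ... (dependency jars)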
Log in to the IBM Entitled Registry with your entitlement key, then build the image with the following Dockerfile:
FROM cp.icr.io/cp/ibm-eventstreams-kafka:11.2.3
COPY ./my-plugins/ /opt/kafka/plugins/
USER 1001
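As a sketch, the registry login and image build could look like this (the entitlement key and image tag below are placeholders):
docker login cp.icr.io -u cp -p <your-entitlement-key>
docker build -t my-connect-cluster-image:latest .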
Push the image to the OpenShift internal registry, into the target namespace where the connector will run, say k-connector.
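For example, using the default external route of the OpenShift image registry (the registry hostname and image name are assumptions; adjust them to your cluster):
REGISTRY=default-route-openshift-image-registry.apps.<cluster-domain>
docker login -u $(oc whoami) -p $(oc whoami -t) $REGISTRY
docker tag my-connect-cluster-image:latest $REGISTRY/k-connector/my-connect-cluster-image:latest
docker push $REGISTRY/k-connector/my-connect-cluster-image:latest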
Create the Connect Cluster
Create the KafkaConnect custom resource with the following YAML:
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster
  namespace: k-connector
  annotations:
    eventstreams.ibm.com/use-connector-resources: "true"
spec:
  template:
    pod:
      imagePullSecrets: []
      metadata:
        annotations:
          eventstreams.production.type: CloudPakForIntegrationNonProduction
          productID: 2a79e49111f44ec3acd89608e56138f5
          productName: IBM Event Streams for Non Production
          productVersion: 11.2.3
          productMetric: VIRTUAL_PROCESSOR_CORE…
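Once the YAML is complete (the spec shown here is truncated and typically also needs the Kafka bootstrap servers and the custom image reference), apply it and check that the Connect cluster comes up. A minimal sketch, assuming the file is saved as kafka-connect.yaml:
oc apply -f kafka-connect.yaml
oc get kafkaconnect my-connect-cluster -n k-connector
oc get pods -n k-connector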