Kafka Open Metadata Topic Connector

The Kafka Open Metadata Topic Connector implements an Apache Kafka connector for a topic that exchanges Java objects as JSON payloads.

Default Configuration


Producer properties (see the Apache Kafka producer configuration documentation for more information and options):

| Property Name | Property Value |
| --- | --- |
| bootstrap.servers | localhost:9092 |
| acks | all |
| retries | 1 |
| batch.size | 16384 |
| linger.ms | 0 |
| buffer.memory | 33554432 |
| max.request.size | 10485760 |
| key.serializer | org.apache.kafka.common.serialization.StringSerializer |
| value.serializer | org.apache.kafka.common.serialization.StringSerializer |
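These producer defaults can be sketched as a Java `Properties` object, as would be passed to a `KafkaProducer` constructor. This is a minimal illustration of the table above; the class and method names here are ours, not part of the connector's API:

```java
import java.util.Properties;

// Illustrative holder for the default producer configuration listed above.
public class ProducerDefaults {

    public static Properties producerDefaults() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");                  // wait for all in-sync replicas
        props.put("retries", "1");
        props.put("batch.size", "16384");
        props.put("linger.ms", "0");               // send immediately, no batching delay
        props.put("buffer.memory", "33554432");    // 32 MB
        props.put("max.request.size", "10485760"); // 10 MB
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerDefaults().getProperty("acks"));
    }
}
```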


Consumer properties (see the Apache Kafka consumer configuration documentation for more information and options):

| Property Name | Property Value |
| --- | --- |
| bootstrap.servers | localhost:9092 |
| enable.auto.commit | true |
| auto.commit.interval.ms | 1000 |
| session.timeout.ms | 30000 |
| max.partition.fetch.bytes | 10485760 |
| key.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
| value.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
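Similarly, the consumer defaults can be sketched as a `Properties` object suitable for a `KafkaConsumer`. Again, the class and method names are illustrative only:

```java
import java.util.Properties;

// Illustrative holder for the default consumer configuration listed above.
public class ConsumerDefaults {

    public static Properties consumerDefaults() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("enable.auto.commit", "true");          // offsets committed automatically
        props.put("auto.commit.interval.ms", "1000");     // once per second
        props.put("session.timeout.ms", "30000");
        props.put("max.partition.fetch.bytes", "10485760"); // 10 MB, matches producer max.request.size
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(consumerDefaults().getProperty("session.timeout.ms"));
    }
}
```

Note that `max.partition.fetch.bytes` is sized to match the producer's `max.request.size`, so any message the producer can send, the consumer can fetch.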


By default, Kafka security is not configured. The exact configuration depends on the specific Kafka service being used; service-specific notes are below. They may also work for other providers, and feedback is welcome so that this documentation can be updated accordingly.

IBM Event Streams on IBM Cloud

Two key pieces of information are provided in the documentation for your configured cloud service: the list of Kafka brokers and the API key.

Given these, configure the Kafka properties for both producer and consumer as follows:

"broker.list": "broker-5-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093, broker-3-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093, broker-2-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093, broker-0-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093, broker-1-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093, broker-4-uniqueid.kafka.svcnn.region.eventstreams.cloud.ibm.com:9093",
"sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username='token' password='MYAPIKEY';"
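Building the security overrides in code might look like the sketch below. The `security.protocol` and `sasl.mechanism` values are the standard Kafka settings for SASL/PLAIN over TLS, which we assume is what the service expects; the helper class itself is illustrative, not part of the connector:

```java
import java.util.Properties;

// Sketch of the extra security properties for a SASL/PLAIN + TLS Kafka
// service such as IBM Event Streams. Broker list and API key come from
// the service credentials; all names here are illustrative.
public class EventStreamsConfig {

    public static Properties saslOverrides(String brokerList, String apiKey) {
        Properties props = new Properties();
        props.put("bootstrap.servers", brokerList);
        props.put("security.protocol", "SASL_SSL");   // SASL authentication over TLS
        props.put("sasl.mechanism", "PLAIN");         // username/password mechanism
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username='token' password='" + apiKey + "';");
        return props;
    }

    public static void main(String[] args) {
        Properties props = saslOverrides("broker-0-uniqueid.example.com:9093", "MYAPIKEY");
        System.out.println(props.getProperty("security.protocol"));
    }
}
```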

An example of this configuration in use can be found in the virtual data connector Helm charts; see the odpi-egeria-vdc Helm chart.

Topic Creation

In addition, many enterprise Kafka services do not allow automatic topic creation.

You will need to manually create topics of the following form.

BASE_TOPIC_NAME is the value used for topicURLRoot when configuring the Egeria event bus, for example 'egeria'.

Cohort topics

For each cohort being used (such as cocoCohort):

OMAS Topics

These need to be created FOR EACH SERVER configured in the environment (for example, for Coco Pharmaceuticals this might include cocoMDS1, cocoMDS2, cocoMDS3, etc.).

They also need to be created FOR EACH OMAS configured (assetconsumer, dataplatform, governanceengine, etc.).

One approach is to initially run against a Kafka service that allows automatic topic creation, then make a note of the topics that were created so that they can be replicated on the restricted setup.

In addition, review the Egeria Audit Log for any events beginning with OCF-KAFKA_TOPIC_CONNECTOR so that action may be taken, for example if topics are found to be missing.