
Kafka mirror maker 2 copy topic as avro not bytes - Stack Overflow


I have a Kafka cluster in the US and I would like to replicate a topic onto an EU cluster. These two clusters share the same schema registry. I snapshot a table and can see the messages as Avro on the NA server, and I can see that the schema on the registry is correct. I then set up a MirrorMaker connector with the following config (producer.override.bootstrap.servers is the same as the first IP in target.cluster.bootstrap.servers):

{
    "connector.class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
    "admin.timeout.ms": "300000",
    "offset-syncs.topic.replication.factor": "1",
    "errors.log.include.messages": "true",
    "producer.override.bootstrap.servers": "IP:9092",
    "topics": "test",
    "source.cluster.alias": "NA",
    "source.cluster.bootstrap.servers": "IP:9092",
    "target.cluster.alias": "",
    "target.cluster.bootstrap.servers": "IP:9092,IP:9092,IP:9092",
    "replication.policy.separator": "-",
    "errors.log.enable": "true",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://IP:8085",
    "value.converter.schema.registry.url": "http://IP:8085"
}

That creates a topic on the other end named NA-test, as expected. My issue is that when I then go to sink the data, the schema is not structured and only contains "bytes".
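For reference, this is roughly what ends up registered for the mirrored topic in that situation (the subject name NA-test-value and the version shown are assumptions, following the default TopicNameStrategy): the MirrorSourceConnector hands the converter plain byte arrays rather than the original records, so the AvroConverter registers a primitive bytes schema instead of the table's record schema.

{
    "subject": "NA-test-value",
    "version": 1,
    "schema": "\"bytes\""
}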

With this setup, using the same schema registry, is it possible to transfer the data as Avro? I have tried writing to the target topic directly by setting the alias and the separator to blank, but that results in this error:

Caused by: org.apache.kafka.common.config.ConfigException: Failed to access Avro data from topic test: Schema being registered is incompatible with an earlier schema for subject "test-key"; error code: 409

Does anyone have a solution so that the schemas match when using MirrorMaker?

asked Nov 20, 2024 at 17:07 by jwolverson
  • If I change the Avro converter to org.apache.kafka.connect.converters.ByteArrayConverter, it stops creating the schema in the registry and only creates the topic – jwolverson, Nov 25, 2024 at 16:19

1 Answer


I found a solution for this: swapping the Avro converter to org.apache.kafka.connect.converters.ByteArrayConverter and then setting "source.cluster.alias" and "replication.policy.separator" to "" allowed the messages to be written to the target topic as Avro, which allowed the sink connector to work.
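A minimal sketch of what that working configuration might look like, based on the config in the question with the converter and replication-policy settings changed as described above (the IP values are placeholders from the original post, and the converter schema registry URLs are no longer needed since the bytes are passed through untouched):

{
    "connector.class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
    "admin.timeout.ms": "300000",
    "offset-syncs.topic.replication.factor": "1",
    "topics": "test",
    "source.cluster.alias": "",
    "replication.policy.separator": "",
    "source.cluster.bootstrap.servers": "IP:9092",
    "target.cluster.alias": "",
    "target.cluster.bootstrap.servers": "IP:9092,IP:9092,IP:9092",
    "producer.override.bootstrap.servers": "IP:9092",
    "key.converter": "org.apache.kafka.connect.converters.ByteArrayConverter",
    "value.converter": "org.apache.kafka.connect.converters.ByteArrayConverter",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true"
}

Because the bytes are copied verbatim, each record keeps its Confluent wire-format prefix (magic byte plus schema ID), so a consumer or sink connector on the EU cluster can still deserialize it as Avro against the shared schema registry.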
