I am trying to use StreamBridge to publish messages to GCP Pub/Sub in CloudEvents format, with Avro for schema validation. Here is the message-building snippet:
import java.net.URI;
import java.util.UUID;
import org.springframework.cloud.function.cloudevent.CloudEventMessageBuilder;
import org.springframework.messaging.Message;
import com.google.cloud.spring.pubsub.support.GcpPubSubHeaders;

Message<Person> ce = CloudEventMessageBuilder
        .withData(person)
        .setId(UUID.randomUUID().toString())
        .setSource(URI.create("http://localhost"))
        .setType("person_add")
        .setDataContentType("application/cloudevents+json")
        .setHeader(GcpPubSubHeaders.ORDERING_KEY, "personKey")
        .build();
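For completeness, the message is then published with StreamBridge along these lines (the binding name "person-out-0" is just a placeholder for my actual output binding):

import org.springframework.cloud.stream.function.StreamBridge;

private final StreamBridge streamBridge; // injected by Spring

public void publish(Message<Person> ce) {
    // Send the CloudEvent message to the output binding bound to the Pub/Sub topic
    streamBridge.send("person-out-0", ce);
}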
The headers for the received message look like this:
{ "ce_datacontenttype": "application/cloudevents+json", "ce_id": "aeb44aa1-524a-4032-b4fc-8013428c214a", "ce_source": "http://localhost", "ce_specversion": "1.0", "ce_type": "person_add", "contentType": "application/json", "message-type": "cloudevent", "target-protocol": "kafka" }
Notice the added, non-CloudEvents headers: contentType, message-type, and target-protocol.
I tried to develop an Avro schema for this, but GCP rejected the schema definition because hyphens ("-") are not valid in Avro names.
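For illustration, here is a trimmed-down version of the kind of schema I was attempting (the record name is a placeholder). Avro names must match [A-Za-z_][A-Za-z0-9_]*, so a field named after the "message-type" header can never be valid:

{
  "type": "record",
  "name": "PersonCloudEvent",
  "fields": [
    { "name": "ce_id", "type": "string" },
    { "name": "ce_type", "type": "string" },
    { "name": "message-type", "type": "string" }
  ]
}

GCP rejects this definition at the "message-type" field.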
Debugging the code, I discovered that CloudEventMessageBuilder.build() post-processes the message before publishing and rewrites any headers prefixed with "ce-" to "ce_". Should it also be converting these added headers? And is there a way to stop these additional headers from being added in the first place?
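In case it clarifies what I'm after, this is the kind of workaround I have been experimenting with: stripping the extra headers myself just before publishing (an untested sketch; I would much rather the headers were never added at all):

import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Copy the built message and drop the hyphenated, non-CloudEvents headers before sending
Message<Person> stripped = MessageBuilder.fromMessage(ce)
        .removeHeaders("message-type", "target-protocol")
        .build();
streamBridge.send("person-out-0", stripped);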