HTTP Sink Connector Batching not working

I am using the HTTP Sink connector config below, but it is still sending records one by one. It is supposed to send data in batches of 50 messages.

{
	"name": "HTTPSinkConnector_1",
	"config": {
		"topics": "topic_1",
		"tasks.max": "1",
		"connector.class": "io.confluent.connect.http.HttpSinkConnector",
		"http.api.url": "http://localhost/messageHandler",
		"request.method": "POST",
		"key.converter":"org.apache.kafka.connect.storage.StringConverter",
		"value.converter": "io.confluent.connect.avro.AvroConverter",
		"value.converter.schema.registry.url": "http://schema-registry:8081",
		"confluent.topic.bootstrap.servers": "kafka:19092",
		"confluent.topic.replication.factor": "1",
		"batching.enabled": true,
		"batch.max.size": 50,
		"reporter.bootstrap.servers": "kafka:19092",
		"reporter.result.topic.name": "success-responses",
		"reporter.result.topic.replication.factor": "1",
		"reporter.error.topic.name": "error-responses",
		"reporter.error.topic.replication.factor": "1",
		"request.body.format": "json"
	}
}

Could someone please suggest whether another property is missing?

Have you checked your message headers and their values? As per the docs:

The HTTP Sink connector does not batch requests for messages containing Kafka header values that are different.

Yes, it is due to different header values.
Is there any way to drop the headers and make batching work?
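If the downstream service does not need those headers, one option to try is the built-in DropHeaders SMT that recent Apache Kafka versions ship with Connect. Note that it requires listing each header name explicitly (there is no wildcard), and the header names below are placeholders for whatever headers vary in your records. A sketch of the extra properties you would add to the connector config:

	"transforms": "dropHeaders",
	"transforms.dropHeaders.type": "org.apache.kafka.connect.transforms.DropHeaders",
	"transforms.dropHeaders.headers": "trace-id,correlation-id"

With the differing headers removed before the sink task sees the records, the connector should be able to group them into batches again, assuming header values were the only thing preventing batching.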
