Sink Connector Data Write Failure

Hi,

I am working on a project, and when I run my code it fails with this error: Sink Connector Data Write Failure.

ERROR: SinkWriteFailureException: Failed to write data to Elasticsearch.
Details:
- Index: orders
- Document ID: 12345
- Cause: Document rejected due to field mapping conflict.
  - Field: 'price'
  - Expected type: 'float'
  - Actual type: 'string'
Recommendations:
1. Review the Elasticsearch index mapping and ensure it matches the data schema.
2. Check the data transformation logic for type mismatches.
3. Validate incoming data to ensure it adheres to the expected types.
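From the error, the sink is receiving 'price' as a string while the index mapping expects a float. If this pipeline uses Kafka Connect (as the linked GitHub issue suggests), one common way to coerce the type is the built-in Cast Single Message Transform. This is only a sketch — the connector name and topic are placeholders, not details from the post:

```json
{
  "name": "es-sink-orders",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "orders",

    "transforms": "castPrice",
    "transforms.castPrice.type": "org.apache.kafka.connect.transforms.Cast$Value",
    "transforms.castPrice.spec": "price:float32"
  }
}
```

The Cast transform rewrites the field's type in each record's value before the connector writes it to Elasticsearch, so string-encoded numbers like "19.99" arrive as floats.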

Why am I getting this error? When I searched for it, I came across this article, Data Scientist vs Machine Learning, and this issue: kafka jdbc sink connectors to mysql incorrect detect database as present · Issue #669 · confluentinc/kafka-connect-jdbc · GitHub. They suggest reviewing the Elasticsearch index mapping to ensure it matches the data schema, checking the data transformation logic for type mismatches, and validating incoming data to ensure it adheres to the expected types.

However, I have already fixed the index mapping and the data schema. How can I set up the data transformation logic to handle type mismatches?
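If fixing it in the connector config is not an option, another approach is to coerce the field on the producer side before the record ever reaches the sink. A minimal sketch, assuming records are plain dicts and the field is named 'price' as in the error (the function name and record shape are illustrative, not from the post):

```python
def coerce_price(record: dict) -> dict:
    """Return a copy of the record with 'price' cast to float.

    Raises ValueError if the value cannot be parsed, so malformed
    records can be routed to a dead-letter queue instead of
    repeatedly failing in the sink connector.
    """
    fixed = dict(record)
    # Cast string-encoded numbers such as "19.99" to float
    fixed["price"] = float(record["price"])
    return fixed

record = {"id": 12345, "price": "19.99"}
print(coerce_price(record)["price"])  # 19.99
```

Validating (and failing fast) at this stage matches recommendation 3 above: bad records are caught before they cause a mapping conflict in Elasticsearch.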

Any help would be appreciated.

Thanks
