ksqlDB: issue creating key columns in Docker images

I have, for example, the below record in my topic (key and value):

Key:   {"MSGTYPE":"1442","FILEID":"00122","FCODE":"450"}
Value: {"MSGTYPE":"1442","PDATE":{"string":"2020-02-19T00:00:00"},"FILEID":"00122","STATUS":{"string":"0"},"FCODE":"450"}
I want to create a sink connector to send this topic to Oracle. Since the topic is plain JSON, I am using ksqlDB to pre-process it into an Avro-serialized topic (so the connector gets a schema) and will point the sink connector at that new topic.
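The rough plan (the stream and topic names below are placeholders I made up) is to register the raw JSON topic as a stream, then re-serialize it into a new Avro topic for the connector:

CREATE STREAM cortex_raw (MSGTYPE VARCHAR, PDATE VARCHAR, FILEID VARCHAR, STATUS VARCHAR, FCODE VARCHAR)
  WITH (KAFKA_TOPIC='CORTEX', VALUE_FORMAT='JSON');

-- re-serialize the same data as Avro into a new topic
CREATE STREAM cortex_avro
  WITH (KAFKA_TOPIC='CORTEX_AVRO', VALUE_FORMAT='AVRO') AS
  SELECT * FROM cortex_raw;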

Note: my environment is Strimzi Kafka on OpenShift.

I pulled the images for the KSQL server and KSQL CLI:
confluentinc/cp-ksql-server:latest
confluentinc/cp-ksql-cli:latest

When I try to create the stream below to declare the key columns, I get errors and can't define any keys in my stream.

CREATE STREAM test-cortex (MSGTYPE VARCHAR KEY, FILEID VARCHAR KEY, FCODE VARCHAR KEY, PDATE VARCHAR, STATUS VARCHAR, COL2 INT, COL3 VARCHAR)
  WITH (KAFKA_TOPIC='CORTEX', VALUE_FORMAT='AVRO', KEY_FORMAT='KAFKA');

Errors:
with KEY: "KSQL currently only supports KEY columns named ROWKEY"
with PRIMARY KEY: "extraneous input 'PRIMARY'"
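If I understand these errors, the cp-ksql images are the old KSQL 5.x line, which only knows the implicit string ROWKEY; the only key-related syntax I can find for that version is the single-field KEY property in the WITH clause, something like (untested):

-- legacy 5.x style: no KEY columns, just a hint that one value column mirrors the key
CREATE STREAM test_cortex (MSGTYPE VARCHAR, PDATE VARCHAR, FILEID VARCHAR, STATUS VARCHAR, FCODE VARCHAR)
  WITH (KAFKA_TOPIC='CORTEX', VALUE_FORMAT='JSON', KEY='FILEID');

That can't represent my three-field JSON key, so I don't think the old images can do what I need.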

When I try other images, such as:

confluentinc/ksqldb-server

I get this error:

The server has encountered an incompatible entry in its log and cannot process further DDL statements.
This is most likely due to the service being rolled back to an earlier version.
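From what I can tell, this happens when the server reads a ksql command topic written by a different version. Since my command topic only contains my failed attempts so far, one workaround I'm considering (untested; the property name is from the ksqlDB docs) is to give the new server a fresh service id so it starts a clean command topic:

# in ksql-server.properties (or KSQL_KSQL_SERVICE_ID as a container env var)
ksql.service.id=cortex_ksql_v2_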

So I'm struggling to deal with the keys in this topic, and I have to go live soon, so I would really appreciate your support.
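For reference, on a current ksqldb-server image my understanding (from the ksqlDB docs, not yet verified) is that a multi-column key needs KEY_FORMAT='JSON' (the 'KAFKA' key format only handles a single primitive key), and that the stream name can't contain a hyphen unless it's quoted, so the statement I'm aiming for looks something like:

CREATE STREAM test_cortex (
    MSGTYPE VARCHAR KEY,
    FILEID  VARCHAR KEY,
    FCODE   VARCHAR KEY,
    PDATE   VARCHAR,
    STATUS  VARCHAR
  ) WITH (KAFKA_TOPIC='CORTEX', KEY_FORMAT='JSON', VALUE_FORMAT='JSON');

Is that the right direction? Thanks in advance.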
