Kafka-connect-transform-cobol

Hi,
I am not able to find any documentation on the kafka-connect-transform-cobol connector. I am looking for more information on the connector plugin and the prerequisites for building the connector.
If anyone who has used the connector has sample code and can provide guidance on the use case, that would be great.

hey @pjahagir

never used it myself,
just discovered the documentation links:

https://jcustenborder.github.io/kafka-connect-documentation/projects/kafka-connect-transform-cobol/transformations/examples/FromCopybook.Simple.html

https://jcustenborder.github.io/kafka-connect-documentation/projects/kafka-connect-transform-cobol/transformations/FromCopybook.html

best,
michael

hi Michael,
Thanks for the prompt response. I will take a look at it.

Regards
Pradeep

I don’t see many details on how to configure the connector, and the connector class name is also missing. It is challenging to build the connector without more info, and the documentation page also says this connector is not tested.

ok I see, I’d need to test it myself.
maybe one step back:

what would you like to achieve?
what does your source data look like?

there is a nice blog post by @KaiWaehner regarding mainframe integration

best,
michael

Thank you Michael. The blog is quite informative. I will take a deep look into it.
I also work for a large bank and am currently replacing an application which receives mainframe data, about 44 million transactions a day.
I have implemented the IBM MQ connector, which ingests the mainframe data in fixed-width format. The consumer application is looking for the data to be in JSON format, and hence I have the task of converting this fixed-width data to JSON. The structure of the data is defined in a COBOL copybook.
Hence I was looking for something like generating a POJO (Plain Old Java Object) from the COBOL copybook, which could then be used in a Kafka Streams application.

Regards
Pradeep
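For illustration, here is a minimal sketch (plain Java, no Kafka dependencies) of the core mapping such a conversion has to do: slice a fixed-width record into named fields and emit JSON. The field names and widths below are hypothetical examples, not taken from any real copybook — in practice the layout would be derived from the copybook's PIC clauses.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FixedWidthToJson {
    // Hypothetical layout; in a real application the names and widths
    // would come from the COBOL copybook.
    private static final String[] FIELDS = {"acctId", "txnType", "amount"};
    private static final int[] WIDTHS = {10, 4, 12};

    /** Slice a fixed-width record into fields and render them as a JSON object. */
    public static String toJson(String record) {
        Map<String, String> parsed = new LinkedHashMap<>();
        int offset = 0;
        for (int i = 0; i < FIELDS.length; i++) {
            parsed.put(FIELDS[i], record.substring(offset, offset + WIDTHS[i]).trim());
            offset += WIDTHS[i];
        }
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : parsed.entrySet()) {
            if (!first) sb.append(',');
            sb.append('"').append(e.getKey()).append("\":\"")
              .append(e.getValue()).append('"');
            first = false;
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        // 26-character record: 10 + 4 + 12
        System.out.println(toJson("0000123456DEBT000000120050"));
    }
}
```

In a Kafka Streams topology, a method like `toJson` would typically be applied inside `mapValues` on the string-valued stream coming from the MQ source topic.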

you’re welcome :slight_smile:

just to be sure:
you’re already running the MQ source connector and are able to send these messages to a Kafka topic?

best,
michael

Yes, the MQ connector is working fine. The only thing is that it is dumping the data in fixed-width format.

Regards
Pradeep

ok which one are you using?
the one provided by confluent?

best,
michael

Yes, the IBMMQSourceConnector.
This connector ingests the MQ messages as a single text field with the fixed-width fields inside it.
Hence the value format seems to be a Kafka string.
The consumer app is looking for JSON format. To convert from fixed width to JSON, I had to build a Kafka Streams application in Java and then convert to JSON there.
I thought there would be some easy way to do this, like an SMT or KSQL would provide,
but did not find anything which would help.
Do you have any suggestions to tackle this ?

Regards
Pradeep

ok I see.
could you elaborate a bit more about

it may be possible with an SMT
have a look at

and

best,
michael
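If the FromCopybook transform from kafka-connect-transform-cobol turns out to work, it would be wired into the MQ source connector config roughly like any other SMT. The fragment below is only a guess at what that might look like — as noted above, the project's docs are incomplete and untested, so the transform class name and its property names are assumptions to verify against the FromCopybook documentation pages linked earlier.

```properties
# Hypothetical fragment of the IBM MQ source connector config.
# The transform class and its properties are assumptions based on the
# project's naming conventions; check the kafka-connect-transform-cobol
# docs before using.
transforms=cobol
transforms.cobol.type=com.github.jcustenborder.kafka.connect.transform.cobol.FromCopybook
# path to the COBOL copybook describing the fixed-width layout
transforms.cobol.copybook=/path/to/copybook.cpy

# emit plain JSON for the consumer application
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```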

fyi:
with Confluent Platform 7.3 there are new
Certified IBM MQ source and sink Premium Connectors for z/OS available

see

and
https://docs.confluent.io/platform/7.3.0/connect/connect-zos.html#certified-connectors-on-z-os

for details

best,
michael