Kafka Connect Oracle CDC connector: Avro serialization error

Hello,

While testing the Oracle CDC source connector I encountered an error.

To check that Schema Registry is reachable I ran:

curl --silent -X GET http://localhost:8081/subjects/ | jq .

Then I created the connector from ksqlDB:

ksql> CREATE SOURCE CONNECTOR `oemgc12-log-connector` WITH(
>"connector.class"= 'io.confluent.connect.oracle.cdc.OracleCdcSourceConnector',
>    "name"= 'oemgc12-log-connector',
>    "tasks.max"='1',
>    "key.converter"= 'io.confluent.connect.avro.AvroConverter',
>    "key.converter.schema.registry.url"= 'http=//localhost:8081',
>    "value.converter"= 'io.confluent.connect.avro.AvroConverter',
>    "value.converter.schema.registry.url"= 'http=//localhost:8081',
>    "confluent.topic.bootstrap.servers"='localhost:9092',
>    "oracle.server"= 'OEMGC12',
>    "oracle.port"= '1521',
>    "oracle.sid"='OEMGC12',
>    "oracle.username"= 'KAFKA_CONNECTOR',
>    "oracle.password"= 'XXXXXXXXXXXXXXXXXXXXXXXXXXX',
>    "start.from"='snapshot',
>    "redo.log.topic.name"= 'redo-log-topic-1',
>    "table.inclusion.regex"='OEMGC12.KAFKA_CONNECTOR.LOGS',
>    "_table.topic.name.template_"='Using template vars to set change event topic for each table',
>    "table.topic.name.template"='${databaseName}.${schemaName}.${tableName}',
>    "connection.pool.max.size"= '20',
>    "confluent.topic.replication.factor"='3',
>    "redo.log.row.fetch.size"='1',
>    "numeric.mapping"='best_fit',
>    "topic.creation.groups"= 'redo',
>    "topic.creation.redo.include"= 'redo-log-topic',
>    "topic.creation.redo.replication.factor"= '3',
>    "topic.creation.redo.partitions"= '1',
>    "topic.creation.redo.cleanup.policy"= 'delete',
>    "topic.creation.redo.retention.ms"= '1209600000',
>    "topic.creation.default.replication.factor"= '3',
>    "topic.creation.default.partitions"= '5',
>    "topic.creation.default.cleanup.policy"= 'compact'
>);
[2021-12-06 10:55:08,544] ERROR WorkerSourceTask{id=oemgc12-log-connector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:190)
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:206)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:321)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:347)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:261)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:237)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
**Caused by: org.apache.kafka.connect.errors.DataException: Failed to serialize Avro data from topic OEMGC12.KAFKA_CONNECTOR.LOGS :**
        at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:93)
        at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$3(WorkerSourceTask.java:321)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
        ... 11 more
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.net.MalformedURLException: no protocol: http=//localhost:8081/subjects/OEMGC12.KAFKA_CONNECTOR.LOGS-value/versions
        at java.net.URL.<init>(URL.java:611)
        at java.net.URL.<init>(URL.java:508)
        at java.net.URL.<init>(URL.java:457)
        at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:260)
        at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:365)
        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:508)
        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:499)
        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:472)
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:213)
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:275)
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:251)
        at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:103)
        at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:153)
        at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:86)
        at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$3(WorkerSourceTask.java:321)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:321)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:347)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:261)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:237)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
[2021-12-06 10:55:08,545] INFO Stopping the 'oemgc12-log-connector' Oracle CDC connector task 0 (io.confluent.connect.oracle.cdc.OracleCdcSourceTask:582)
[2021-12-06 10:55:08,546] ERROR SQL exception:  (io.confluent.connect.oracle.cdc.logging.LogUtils:22)
java.sql.SQLRecoverableException: Closed Connection
        at oracle.jdbc.driver.PhysicalConnection.needLine(PhysicalConnection.java:3525)
        at oracle.jdbc.driver.OracleStatement.closeOrCache(OracleStatement.java:1478)
        at oracle.jdbc.driver.OracleStatement.close(OracleStatement.java:1461)
        at oracle.jdbc.driver.OracleStatementWrapper.close(OracleStatementWrapper.java:122)
        at oracle.ucp.jdbc.proxy.oracle.StatementProxy.close(StatementProxy.java:145)
        at oracle.ucp.jdbc.proxy.oracle$1ucp$1jdbc$1proxy$1oracle$1StatementProxy$2oracle$1jdbc$1internal$1OracleStatement$$$Proxy.close(Unknown Source)
        at oracle.ucp.jdbc.proxy.oracle.ConnectionProxyBase.handleSQRecoverableException(ConnectionProxyBase.java:94)
        at oracle.ucp.jdbc.proxy.oracle.StatementProxy.onError(StatementProxy.java:255)
        at oracle.ucp.jdbc.proxy.oracle$1ucp$1jdbc$1proxy$1oracle$1StatementProxy$2oracle$1jdbc$1internal$1OracleStatement$$$Proxy.executeQuery(Unknown Source)
        at io.confluent.connect.oracle.cdc.logging.LogUtils.executeQuery(LogUtils.java:20)
        at io.confluent.connect.oracle.cdc.mining.WithContinuousMining.mineRedoLogs(WithContinuousMining.java:80)
        at io.confluent.connect.oracle.cdc.OracleRedoLogReader.lambda$readRedoLogs$0(OracleRedoLogReader.java:93)
        at io.confluent.connect.utils.retry.RetryPolicy.callWith(RetryPolicy.java:417)
        at io.confluent.connect.utils.retry.RetryPolicy.callWith(RetryPolicy.java:368)
        at io.confluent.connect.oracle.cdc.OracleDatabase.retry(OracleDatabase.java:553)
        at io.confluent.connect.oracle.cdc.OracleRedoLogReader.readRedoLogs(OracleRedoLogReader.java:91)
        at io.confluent.connect.oracle.cdc.util.RecordQueue.lambda$createLoggingSupplier$0(RecordQueue.java:465)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
[2021-12-06 10:55:08,976] INFO Start LogMiner CONTINUOUS_MINE session at 113222311256 in mode ONLINE (io.confluent.connect.oracle.cdc.mining.WithContinuousMining:60)
[2021-12-06 10:55:13,232] INFO WorkerSourceTask{id=oemgc12-log-connector-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:510)
[2021-12-06 10:55:18,546] ERROR SQL exception:  (io.confluent.connect.oracle.cdc.logging.LogUtils:22)
java.sql.SQLRecoverableException: Closed Connection
        at oracle.jdbc.driver.PhysicalConnection.needLine(PhysicalConnection.java:3525)
        at oracle.jdbc.driver.OracleStatement.closeOrCache(OracleStatement.java:1478)
        at oracle.jdbc.driver.OracleStatement.close(OracleStatement.java:1461)
        at oracle.jdbc.driver.OracleStatementWrapper.close(OracleStatementWrapper.java:122)
        at oracle.ucp.jdbc.proxy.oracle.StatementProxy.close(StatementProxy.java:145)
        at oracle.ucp.jdbc.proxy.oracle$1ucp$1jdbc$1proxy$1oracle$1StatementProxy$2oracle$1jdbc$1internal$1OracleStatement$$$Proxy.close(Unknown Source)
        at oracle.ucp.jdbc.proxy.oracle.ConnectionProxyBase.handleSQRecoverableException(ConnectionProxyBase.java:94)
        at oracle.ucp.jdbc.proxy.oracle.StatementProxy.onError(StatementProxy.java:255)
        at oracle.ucp.jdbc.proxy.oracle$1ucp$1jdbc$1proxy$1oracle$1StatementProxy$2oracle$1jdbc$1internal$1OracleStatement$$$Proxy.executeQuery(Unknown Source)
        at io.confluent.connect.oracle.cdc.logging.LogUtils.executeQuery(LogUtils.java:20)
        at io.confluent.connect.oracle.cdc.mining.WithContinuousMining.mineRedoLogs(WithContinuousMining.java:80)
        at io.confluent.connect.oracle.cdc.OracleRedoLogReader.lambda$readRedoLogs$0(OracleRedoLogReader.java:93)
        at io.confluent.connect.utils.retry.RetryPolicy.callWith(RetryPolicy.java:417)
        at io.confluent.connect.utils.retry.RetryPolicy.callWith(RetryPolicy.java:368)
        at io.confluent.connect.oracle.cdc.OracleDatabase.retry(OracleDatabase.java:553)
        at io.confluent.connect.oracle.cdc.OracleRedoLogReader.readRedoLogs(OracleRedoLogReader.java:91)
        at io.confluent.connect.oracle.cdc.util.RecordQueue.lambda$createLoggingSupplier$0(RecordQueue.java:465)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
[2021-12-06 10:55:18,546] INFO Stopped the 'oemgc12-log-connector' Oracle CDC connector task 0 (io.confluent.connect.oracle.cdc.OracleCdcSourceTask:622)
[2021-12-06 10:55:18,547] INFO [Producer clientId=connector-producer-oemgc12-log-connector-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1204)
[2021-12-06 10:55:18,549] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-12-06 10:55:18,549] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-12-06 10:55:18,549] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-12-06 10:55:18,549] INFO App info kafka.producer for connector-producer-oemgc12-log-connector-0 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-12-06 10:55:18,550] INFO App info kafka.admin.client for connector-adminclient-oemgc12-log-connector-0 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-12-06 10:55:18,552] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-12-06 10:55:18,552] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
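Looking at the innermost Caused by, the MalformedURLException reports "no protocol: http=//localhost:8081/...", which matches the converter URLs in my config: I appear to have typed http=// instead of http:// in key.converter.schema.registry.url and value.converter.schema.registry.url. If that is the root cause, I assume the fix is simply to correct those two properties and recreate the connector, e.g.:

    "key.converter.schema.registry.url"= 'http://localhost:8081',
    "value.converter.schema.registry.url"= 'http://localhost:8081',

I am also assuming the later SQLRecoverableException: Closed Connection messages are just a consequence of the task being killed after the serialization failure, not a separate problem.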

Thanks for your help