I am new to this community. I have a requirement to move data with Kafka Connect from our Sybase (legacy) database to Postgres (new) database.
Could you please guide me on a few points?
1> Which source connector should I use (the generic JDBC connector, or one specific to Sybase)?
2> Which sink connector should I use (the generic JDBC connector, or one specific to Postgres)?
3> Could you point me to a sample/reference example? (I have included a sketch of what I was planning to try below.)
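For context, here is roughly what I had in mind, assuming the generic Confluent JDBC source and sink connectors are the right fit. The hosts, credentials, table, and topic names below are placeholders, not my real values.

```properties
# JDBC source connector sketch (all values are placeholders)
name=jdbc_source_sybase
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# jConnect-style URL; host/port/database are placeholders
connection.url=jdbc:sybase:Tds:sybase-host:5000/mydb
connection.user=user
connection.password=password
table.whitelist=my_table
mode=incrementing
incrementing.column.name=id
topic.prefix=sybase-
```

```properties
# JDBC sink connector sketch (all values are placeholders)
name=jdbc_sink_postgres
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
connection.url=jdbc:postgresql://postgres-host:5432/mydb
connection.user=user
connection.password=password
topics=sybase-my_table
insert.mode=upsert
pk.mode=record_value
pk.fields=id
auto.create=true
```

Is this the right general shape, or would you recommend something else for Sybase specifically?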
Hi @daveklein: I have successfully started Confluent Kafka 6.2.0 on my local system.
I have also installed the latest version of Sybase ASE. But when I create the JDBC source connector, it stays in the RUNNING state and no task is created. When I checked the connector logs, I found the exception below.
```
ERROR Error while trying to get updated table list, ignoring and waiting for next table poll interval (io.confluent.connect.jdbc.source.TableMonitorThread:144)
com.sybase.jdbc4.jdbc.SybSQLException: '""' is not a valid type name.
    at com.sybase.jdbc4.tds.Tds.processEed(Tds.java:4117)
    at com.sybase.jdbc4.tds.Tds.nextResult(Tds.java:3207)
    at com.sybase.jdbc4.jdbc.ResultGetter.nextResult(ResultGetter.java:78)
    at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:293)
    at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:275)
    at com.sybase.jdbc4.jdbc.SybStatement.queryLoop(SybStatement.java:2642)
    at com.sybase.jdbc4.jdbc.SybCallableStatement.executeQuery(SybCallableStatement.java:151)
    at com.sybase.jdbc4.jdbc.SybDatabaseMetaData.returnResults(SybDatabaseMetaData.java:5490)
    at com.sybase.jdbc4.jdbc.SybDatabaseMetaData.getTables(SybDatabaseMetaData.java:3981)
    at io.confluent.connect.jdbc.dialect.GenericDatabaseDialect.tableIds(GenericDatabaseDialect.java:417)
    at io.confluent.connect.jdbc.source.TableMonitorThread.updateTables(TableMonitorThread.java:141)
    at io.confluent.connect.jdbc.source.TableMonitorThread.run(TableMonitorThread.java:76)
[2021-07-14 17:30:51,095] INFO Closing connection #1 to Generic (io.confluent.connect.jdbc.util.CachedConnectionProvider:105)
[2021-07-14 17:31:01,120] ERROR [Worker clientId=connect-1, groupId=connect-cluster] Failed to reconfigure connector's tasks (jdbc_source_sybase_011), retrying after backoff: (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1388)
org.apache.kafka.connect.errors.ConnectException: Tables could not be updated quickly enough.
    at io.confluent.connect.jdbc.source.TableMonitorThread.tables(TableMonitorThread.java:110)
    at io.confluent.connect.jdbc.JdbcSourceConnector.taskConfigs(JdbcSourceConnector.java:149)
    at org.apache.kafka.connect.runtime.Worker.connectorTaskConfigs(Worker.java:373)
    at org.apache.kafka.connect.runtime.distributed.DistributedHerder.reconfigureConnector(DistributedHerder.java:1432)
    at org.apache.kafka.connect.runtime.distributed.DistributedHerder.reconfigureConnectorTasksWithRetry(DistributedHerder.java:1379)
    at org.apache.kafka.connect.runtime.distributed.DistributedHerder.lambda$null$19(DistributedHerder.java:1338)
    at org.apache.kafka.connect.runtime.distributed.DistributedHerder.tick(DistributedHerder.java:398)
    at org.apache.kafka.connect.runtime.distributed.DistributedHerder.run(DistributedHerder.java:316)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
```
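While I dig into the root cause, one workaround I am considering is query mode, since (if I understand the connector correctly) setting `query` skips the table monitor thread, and the failure above comes from that thread calling SybDatabaseMetaData.getTables(). A minimal sketch, with the query, column, and topic prefix as placeholders for my actual table:

```properties
name=jdbc_source_sybase_query
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# placeholder connection details
connection.url=jdbc:sybase:Tds:localhost:5000/mydb
connection.user=user
connection.password=password
# query mode: no table.whitelist, so no table-metadata discovery,
# which should avoid the getTables() call that fails above
query=SELECT * FROM dbo.my_table
mode=incrementing
incrementing.column.name=id
topic.prefix=sybase-my_table
poll.interval.ms=5000
```

Does this sound like a reasonable direction, or is there a known fix for the getTables() error with Sybase ASE?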