Kafka connector for Sybase and Postgres DB

Hi All,

I am new to this community. I have a requirement to move data using a Kafka connector from our Sybase database (legacy DB) to Postgres (new DB).
Could you please guide me on a few points?

1> Which source connector should I use (JDBC, or one specific to Sybase)?
2> Which sink connector should I use (JDBC, or one specific to Postgres)?
3> Please point me to a sample/reference example.

Hi Mahesh, welcome to the community! It doesn’t look like there are any Sybase-specific source connectors, and though there is a Debezium source connector for Postgres, I don’t think there are any Postgres-specific sink connectors, so it looks like the JDBC connector will be your best bet for both.
Here are the docs for that connector: https://docs.confluent.io/kafka-connect-jdbc/current/index.html
And here are a couple of helpful blog posts from @rmoff:
Kafka Connect JDBC Sink - setting the key field name
Kafka Connect JDBC Sink deep-dive: Working with Primary Keys
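To give a rough idea of what the Postgres side could look like, here is a minimal JDBC sink configuration sketch. The connector name, host, database, credentials, topic, and key field here are placeholders I made up for illustration, not values from this thread, so adjust them to your environment:

```json
{
    "name": "jdbc_sink_postgres_01",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "connection.url": "jdbc:postgresql://XXX.XXX.XXX.XXX:5432/targetdb",
        "connection.user": "postgres",
        "connection.password": "XXXXXXX",
        "topics": "sybase-01-mytable",
        "auto.create": "true",
        "insert.mode": "upsert",
        "pk.mode": "record_key",
        "pk.fields": "id"
    }
}
```

`auto.create` lets the connector create the target table if it doesn't exist, and `upsert` with `pk.mode`/`pk.fields` avoids duplicate rows on re-delivery; the blog posts above go into the primary-key options in detail.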

Feel free to post back here if you run into problems, or just to let us all know how it’s going.

Thanks,
Dave

@daveklein : Thank you so much. This will help me. I will start my POC next week and get back to you if any help is required.

Hi @daveklein : I have successfully started Confluent Kafka 6.2.0 on my local system.
I have also installed the latest version of Sybase ASE. But when I create the JDBC connector, it shows as running but no task is created. In the connector logs, I see the exception below.

JSON config:

{
    "name": "jdbc_source_sybase_01",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:sybase:Tds:XXX.XXX.XXX.XXX:5000/pubs2",
        "connection.user": "sa",
        "connection.password": "XXXXXXX",
        "topic.prefix": "sybase-01-",
        "mode": "bulk"
    }
}

Exception

 ERROR Error while trying to get updated table list, ignoring and waiting for next table poll interval (io.confluent.connect.jdbc.source.TableMonitorThread:144)
com.sybase.jdbc4.jdbc.SybSQLException: '""' is not a valid type name.
        at com.sybase.jdbc4.tds.Tds.processEed(Tds.java:4117)
        at com.sybase.jdbc4.tds.Tds.nextResult(Tds.java:3207)
        at com.sybase.jdbc4.jdbc.ResultGetter.nextResult(ResultGetter.java:78)
        at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:293)
        at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:275)
        at com.sybase.jdbc4.jdbc.SybStatement.queryLoop(SybStatement.java:2642)
        at com.sybase.jdbc4.jdbc.SybCallableStatement.executeQuery(SybCallableStatement.java:151)
        at com.sybase.jdbc4.jdbc.SybDatabaseMetaData.returnResults(SybDatabaseMetaData.java:5490)
        at com.sybase.jdbc4.jdbc.SybDatabaseMetaData.getTables(SybDatabaseMetaData.java:3981)
        at io.confluent.connect.jdbc.dialect.GenericDatabaseDialect.tableIds(GenericDatabaseDialect.java:417)
        at io.confluent.connect.jdbc.source.TableMonitorThread.updateTables(TableMonitorThread.java:141)
        at io.confluent.connect.jdbc.source.TableMonitorThread.run(TableMonitorThread.java:76)
[2021-07-14 17:30:51,095] INFO Closing connection #1 to Generic (io.confluent.connect.jdbc.util.CachedConnectionProvider:105)
[2021-07-14 17:31:01,120] ERROR [Worker clientId=connect-1, groupId=connect-cluster] Failed to reconfigure connector's tasks (jdbc_source_sybase_011), retrying after backoff: (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1388)
org.apache.kafka.connect.errors.ConnectException: Tables could not be updated quickly enough.
        at io.confluent.connect.jdbc.source.TableMonitorThread.tables(TableMonitorThread.java:110)
        at io.confluent.connect.jdbc.JdbcSourceConnector.taskConfigs(JdbcSourceConnector.java:149)
        at org.apache.kafka.connect.runtime.Worker.connectorTaskConfigs(Worker.java:373)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.reconfigureConnector(DistributedHerder.java:1432)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.reconfigureConnectorTasksWithRetry(DistributedHerder.java:1379)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.lambda$null$19(DistributedHerder.java:1338)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.tick(DistributedHerder.java:398)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder.run(DistributedHerder.java:316)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:829)

Looks like you’ve posted this over at JDBC Source connector for Sybase: com.sybase.jdbc4.jdbc.SybSQLException: ‘""’ is not a valid type name, so I’ll close this thread now.