Processing large CSV files using SFTP Source Kafka Connector

I am using a standalone setup to read large CSV files (50 MB to 500 MB) with the SFTP source connector. The connector stops abruptly and needs a restart to finish processing. If I split a big file into smaller files, the connector works fine.
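For context, a standalone connector configuration for this kind of setup might look roughly like the sketch below. This is a hedged example, not my exact config: the connector class and property names are from Confluent's SFTP CSV source connector, and every host, path, credential, and topic name is a placeholder.

```properties
name=sftp-csv-source
connector.class=io.confluent.connect.sftp.SftpCsvSourceConnector
tasks.max=1
kafka.topic=csv-data

# SFTP connection details (placeholders)
sftp.host=sftp.example.com
sftp.port=22
sftp.username=user
sftp.password=secret

# Directories on the SFTP server (placeholders)
input.path=/data/incoming
finished.path=/data/finished
error.path=/data/error
input.file.pattern=.*\.csv

# CSV parsing
csv.first.row.as.header=true
schema.generation.enabled=true

# Records fetched per poll; a single large file is still handled by one task
batch.size=1000
```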
Are there any limitations or thresholds for processing larger files?
What is the best way to handle this case? Would distributed mode with specific memory and CPU resources help?
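In case it is memory pressure rather than a connector limit, one thing I could try before moving to distributed mode is raising the worker's JVM heap, since the standalone worker starts with a fairly modest default heap. A sketch (the heap sizes and file names here are illustrative, not a recommendation):

```shell
# Raise the Connect worker heap before launching standalone mode
export KAFKA_HEAP_OPTS="-Xms1g -Xmx4g"
connect-standalone worker.properties sftp-source.properties
```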
Thank you.