Connecting to AWS S3 (S3 sink)

Hi, I am brand new to Confluent Kafka and am trying to set up a POC that copies data from SAP using SAP BTP (DIC) and Confluent Kafka, and pushes the data to AWS S3.

So far I have managed to get data flowing into Confluent Kafka and I can see messages in a topic. I am now trying to use an S3 Sink connector to connect this topic to S3, but I keep getting the message

The connector failed because the S3 bucket has been removed, renamed or is inaccessible. Please make sure the connector is configured to upload records to an existing S3 bucket.

I have checked the IAM policies set up in AWS and they all seem to be as per the documentation.

I have also used the AWS CLI on my machine with the same IAM credentials, and I can push data to that bucket manually.

Any ideas what I have done wrong?

Thanks

Pete Gadsby

Hi Peter,
Using the AWS CLI to write data to an S3 bucket generally requires fewer IAM permissions than Confluent’s fully-managed S3 Sink Connector does.
Given the error message you provided, I’m confident the root cause still lies with IAM permissions.
You can probably solve the issue by following the IAM policy instructions in the documentation. :point_down:
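
For reference, the bucket policy described there looks roughly like the sketch below. This is only an outline from memory: the bucket name is a placeholder and the exact list of actions may differ, so please treat the linked documentation as the source of truth.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListAllBuckets",
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets"],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Sid": "BucketLevelAccess",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::<your-bucket-name>"
    },
    {
      "Sid": "ObjectLevelAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": "arn:aws:s3:::<your-bucket-name>/*"
    }
  ]
}
```

A manual `aws s3 cp` can succeed with little more than `s3:PutObject` on the object path, while the connector also lists the bucket and reads its location, which is why the CLI test passing does not rule out a permissions gap.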

Best,
Batu

Thanks Batu, I actually solved the problem today… In the S3 connector there is a field called “Store URL”. I assumed this was the URL for the bucket in question, as it appears on the page. However, this should apparently be left blank for this use case. Once I cleared it, the connector worked fine.
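
For anyone who later runs the self-managed S3 sink rather than the fully-managed one, the equivalent setting appears to be the `store.url` property, which is intended for S3-compatible stores other than AWS (MinIO and the like) and should simply be omitted when writing to AWS itself. A minimal sketch of such a config, with made-up topic, bucket and region names:

```json
{
  "name": "s3-sink-poc",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "sap-orders",
    "s3.bucket.name": "my-poc-bucket",
    "s3.region": "eu-west-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000",
    "tasks.max": "1"
  }
}
```

With no store URL set, the connector should derive the S3 endpoint from the region and bucket, which matches the behaviour described above once the field was cleared.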

Thanks for responding though.

Best

Pete Gadsby


Great @PeterGadsby :sunglasses:,
Thanks for sharing the solution!