How to increase the partitions of an existing topic while using the S3 sink connector

I am using the S3 sink connector with Kafka Connect and trying to load data into S3. Somehow I am unable to update or increase the number of partitions of the topic, and I also cannot change the offset.flush.timeout.ms value. I am trying to add these in the S3 connector curl request I am using, but nothing updates.

{
    "name": "my-s3-sink3",
     "config": {
         "connector.class"               : "io.confluent.connect.s3.S3SinkConnector",
         "tasks.max"                     : "1",
         "topics"                        : "mysource.topic",
         "s3.region"                     : "us-east-1",
         "s3.bucket.name"                : "topicbucket001",
         "s3.part.size"                  : "5242880",
         "flush.size"                    : "1",
         "key.converter"                 : "org.apache.kafka.connect.storage.StringConverter",
         "value.converter"               : "org.apache.kafka.connect.json.JsonConverter",
         "value.converter.schemas.enable": "false",
         "storage.class"                 : "io.confluent.connect.s3.storage.S3Storage",
         "format.class"                  : "io.confluent.connect.s3.format.json.JsonFormat",
         "partitioner.class"             : "io.confluent.connect.storage.partitioner.DefaultPartitioner",
         "schema.compatibility"          : "NONE"
         "offset.flust.timeout.ms"        : 1000
         "topic.creation.default.replication.factor": 3,
         "topic.creation.default.partitions": 10,
         "topic.creation.default.compression.type": "snappy"
        }
    }

@Kanikamiglani31 can you share the curl command you’re running to update the config? Also, for clarity, when you say “increase the size of topic partition”, are you talking about increasing the number of partitions in Kafka itself using the kafka-topics command?
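
For reference, a minimal sketch of what that kafka-topics invocation could look like, using the topic name from your config and a placeholder broker address (the address is not from this thread):

    # Increase the partition count of an existing topic (partitions can be increased but never decreased)
    kafka-topics --bootstrap-server <your-msk-broker>:9092 \
        --alter --topic mysource.topic --partitions 10

    # Confirm the new partition count
    kafka-topics --bootstrap-server <your-msk-broker>:9092 \
        --describe --topic mysource.topic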

Hey Rick,
Thanks for the response here. I am using:
curl -X POST http://localhost:8083/connectors -H 'Content-Type: application/json' -H 'Accept: application/json' -d '{the JSON mentioned in the above ticket}'
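
For readability, here is a sketch of that same request with the JSON saved to a file first (the file name my-s3-sink3.json is just an example):

    # Create the connector by POSTing the full {"name": ..., "config": {...}} document
    curl -X POST http://localhost:8083/connectors \
        -H 'Content-Type: application/json' \
        -H 'Accept: application/json' \
        -d @my-s3-sink3.json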

Increasing the number of partitions for the existing topics I am creating in Kafka (I am using AWS MSK for Kafka), but I am creating those topics with the above config through the S3 sink connector.
Even if I can only set the partition count for new topics, I am good with that.

Note: I have enabled auto.create.topics.enable=true in my Kafka setup, so I am just passing a topic name in my S3 sink connector config and the topic is created automatically.
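
For reference, the broker-side settings that govern auto-created topics look roughly like this (example values, not my exact MSK configuration; on MSK these go into the cluster configuration rather than a local server.properties file):

    # Broker-side properties that shape auto-created topics (example values)
    auto.create.topics.enable=true
    num.partitions=10
    default.replication.factor=3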

Thanks
Kanika

Kanika- Check out the Kafka Connect REST API docs, in particular the PUT command for an existing connector’s configuration.

https://docs.confluent.io/platform/current/connect/references/restapi.html#put--connectors-(string-name)-config
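
A minimal sketch of what that PUT could look like for the connector above, assuming the config map (just the inner "config" object, without the "name"/"config" wrapper) is saved in a file such as my-s3-sink3-config.json (the file name is an example):

    # Update the existing connector's configuration in place
    curl -X PUT http://localhost:8083/connectors/my-s3-sink3/config \
        -H 'Content-Type: application/json' \
        -d @my-s3-sink3-config.json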

Hey Rick,

I don’t see anywhere in those docs where it creates partitions for the topics. Is there any config for that?

Thanks
Kanika