Kafka deployment automation

For end-to-end integration testing, we would like an approach that lets us boot a cluster, run our data pipeline (which consists of several Kafka Streams applications), collect the results, then shut down the cluster.

In the past we automated this fairly easily with an on-premise Kafka deployment running on Kubernetes, driven by a combination of GitHub Actions. We also had the option to simply scale down (without deleting the state) if we wanted to keep the data around for a while before deleting it all, in case we needed to double-check something.

However, my organization is pushing to expand our use of Confluent Cloud, so I am exploring what the options are.

I could not find anything that enables this kind of automation, for two reasons: the CLI only allows deleting the cluster entirely (and I assume that includes the data), and provisioning a Dedicated cluster can reportedly take up to 24 hours, which is far too slow for automation.

So I am asking here: what are my options, or did I miss anything?

Consider using the Confluent Cloud API to get more control over your clusters; you can script cluster creation and teardown around your test runs. Also look into infrastructure-as-code tools like Terraform: the Confluent Terraform provider can manage the lifecycle of clusters, topics, and API keys, which fits CI pipelines well.
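As a rough illustration of the scripted approach, here is a minimal Python sketch of creating and deleting a throwaway cluster through the Confluent Cloud REST API. It assumes the cmk/v2 cluster-management endpoints and a Basic cluster type (Basic/Standard clusters generally provision in minutes, unlike Dedicated); verify the exact paths and payload fields against Confluent's API reference before relying on this.

```python
"""Sketch: automate a throwaway Confluent Cloud cluster for CI runs.

Assumptions (check against Confluent's API docs):
  - the cmk/v2 cluster-management endpoints shown below,
  - a Basic cluster is acceptable for integration tests.
"""
import base64
import json
import urllib.request

API_BASE = "https://api.confluent.cloud"


def build_cluster_spec(name, env_id, cloud="AWS", region="us-east-1"):
    # Payload shape follows the cmk/v2 spec as I understand it --
    # field names are an assumption, not verified.
    return {
        "spec": {
            "display_name": name,
            "cloud": cloud,
            "region": region,
            "availability": "SINGLE_ZONE",
            "config": {"kind": "Basic"},
            "environment": {"id": env_id},
        }
    }


def _request(method, path, api_key, api_secret, body=None):
    # Confluent Cloud API keys authenticate with HTTP Basic auth.
    token = base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()
    req = urllib.request.Request(
        API_BASE + path,
        method=method,
        data=json.dumps(body).encode() if body is not None else None,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def create_test_cluster(name, env_id, api_key, api_secret):
    spec = build_cluster_spec(name, env_id)
    return _request("POST", "/cmk/v2/clusters", api_key, api_secret, spec)


def delete_test_cluster(cluster_id, env_id, api_key, api_secret):
    path = f"/cmk/v2/clusters/{cluster_id}?environment={env_id}"
    return _request("DELETE", path, api_key, api_secret)
```

A CI job (e.g. a GitHub Actions step) could call `create_test_cluster` before the pipeline, poll until the cluster is ready, run the Streams applications, and call `delete_test_cluster` in an always-run cleanup step.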