For end-to-end integration testing, we would like an approach that lets us boot a cluster, run our data pipeline (which consists of several Kafka Streams applications), collect the results, and then shut the cluster down.
In the past we were able to automate this fairly easily with an on-premise Kafka deployment running on Kubernetes, driven by GitHub Actions. We also had the option to simply scale the cluster down (without deleting its state) if we wanted to keep the data around for a while before removing it, in case we needed to double-check something.
However, my organization is pushing to expand our use of Confluent Cloud, so I am exploring what the options are there.
I could not find anything that would enable this kind of automation, for two reasons: the CLI only allows fully deleting a cluster, which I assume deletes the data as well, and the documentation says provisioning a dedicated cluster can take up to 24 hours, which is far too slow for automated runs.
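For context, here is roughly the kind of GitHub Actions workflow I was hoping to end up with. This is only a sketch under assumptions: it assumes a Basic (not dedicated) cluster would be acceptable, that the `confluent` CLI is installed and authenticated via repository secrets, and the cluster name, cloud/region, and test script are placeholders I made up:

```yaml
# Hypothetical sketch: ephemeral Confluent Cloud cluster per test run.
# Cluster name, cloud/region, and run-e2e-tests.sh are placeholders.
name: e2e-integration-tests
on: workflow_dispatch

jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Create ephemeral cluster
        id: create
        run: |
          # Capture the cluster id from the CLI's JSON output (assumes
          # the create response includes an "id" field).
          ID=$(confluent kafka cluster create e2e-test \
                 --cloud aws --region us-east-1 --type basic \
                 --output json | jq -r '.id')
          echo "cluster_id=$ID" >> "$GITHUB_OUTPUT"

      - name: Run pipeline and collect results
        run: ./run-e2e-tests.sh   # placeholder for the actual pipeline

      - name: Tear down cluster
        if: always()   # delete even if the tests fail
        run: |
          confluent kafka cluster delete \
            "${{ steps.create.outputs.cluster_id }}" --force
```

The problem is the teardown step: deleting the cluster is (as far as I can tell) the only option, so there is no equivalent of our old "scale down but keep the state" step.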
Hence I am asking here: what are my options, or did I miss anything?