There’s a new Streaming Audio episode - check it out!
How do you use Apache Kafka®, Confluent Platform, and Confluent Cloud for DevOps? Integration Architects Rick Spurgeon and Allison Walther share how, including a custom tool they’ve developed for this very purpose.
First, Rick and Allison share their perspective on what it means to be a DevOps engineer: mixing development and operations skills to deploy, manage, monitor, audit, and maintain distributed systems. DevOps is multifaceted and can be compared to glue: you stitch software, services, databases, Kafka, and more together into end-to-end solutions.
Using the Confluent Cloud Metrics API (actionable operational metrics), you can pull a wide range of metrics about your cluster, such as bytes, records, and requests for a topic or partition. The Metrics API is unique in that it is queryable: you can ask it, "What's the max retained bytes per hour over 10 hours for my topic or my cluster?" and get the answer just like that.
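To give a flavor of what such a query looks like, here is a minimal sketch in Python against the Metrics API's query endpoint. The cluster ID, API key, and time interval are placeholders, and the exact metric name, aggregation, and endpoint path should be verified against the current Confluent Cloud Metrics API documentation.

```python
import requests

# Placeholder credentials and cluster ID -- substitute your own.
API_KEY = "YOUR_CLOUD_API_KEY"
API_SECRET = "YOUR_CLOUD_API_SECRET"
CLUSTER_ID = "lkc-xxxxx"

# Query: max retained bytes per hour over a 10-hour window.
# Metric and field names follow the Metrics API conventions; check the
# current docs for the exact spelling and supported aggregations.
query = {
    "aggregations": [
        {"metric": "io.confluent.kafka.server/retained_bytes", "agg": "MAX"}
    ],
    "filter": {"field": "resource.kafka.id", "op": "EQ", "value": CLUSTER_ID},
    "granularity": "PT1H",
    "intervals": ["2020-06-01T00:00:00Z/PT10H"],
}

resp = requests.post(
    "https://api.telemetry.confluent.cloud/v2/metrics/cloud/query",
    auth=(API_KEY, API_SECRET),
    json=query,
)
resp.raise_for_status()

# Each data point carries a timestamp and the aggregated value for that hour.
for point in resp.json().get("data", []):
    print(point["timestamp"], point["value"])
```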
To make writing operators much easier, Rick and Allison also cover Crossplane, KUDO, and Shell-operator, and how to use these tools.
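As a flavor of the Shell-operator approach, here is a minimal hook sketch (Shell-operator runs any executable, so Python works as well as Bash). The watched resource and the event handling are illustrative only; check the Shell-operator docs for the binding configuration and binding context details.

```python
#!/usr/bin/env python3
# Minimal Shell-operator hook sketch. Run with --config, it prints the
# binding configuration; on a matching Kubernetes event, Shell-operator
# invokes it again with the binding context written to the file named
# in $BINDING_CONTEXT_PATH.
import json
import os
import sys

# Illustrative binding: react to ConfigMap additions and modifications.
CONFIG = {
    "configVersion": "v1",
    "kubernetes": [
        {
            "apiVersion": "v1",
            "kind": "ConfigMap",
            "executeHookOnEvent": ["Added", "Modified"],
        }
    ],
}


def main():
    if "--config" in sys.argv:
        print(json.dumps(CONFIG))
        return

    # Binding context: a JSON array describing the event(s) that fired.
    with open(os.environ["BINDING_CONTEXT_PATH"]) as f:
        contexts = json.load(f)

    for ctx in contexts:
        obj = ctx.get("object", {})
        name = obj.get("metadata", {}).get("name", "<unknown>")
        print(f"ConfigMap event: {ctx.get('watchEvent')} on {name}")


if __name__ == "__main__":
    main()
```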
EPISODE LINKS
- Confluent Cloud Metrics API
- Shell Operator
- kafka-devops
- The Kubernetes Universal Declarative Operator
- Introducing the AWS Controllers for Kubernetes (ACK)
- Manage any infrastructure your applications need directly from Kubernetes with Crossplane
- Apache Kafka DevOps with Kubernetes and GitOps
- Spring Your Microservices into Production with Kubernetes and GitOps
- Join the Confluent Community Slack
- Learn more with Kafka tutorials, resources, and guides at Confluent Developer
- Live demo: Kafka streaming in 10 minutes on Confluent Cloud
- Use 60PDCAST to get an additional $60 of free Confluent Cloud usage (details)