I am completely new to the world of Kafka. I understand the basic principles and I can clearly see the potential of this platform.
However, I am not a developer and know nothing about Java.
I explore this field using Docker and ChatGPT/Copilot (yes… I am one of “them”).
What I want to do is to explore this using Docker Desktop on my own computer to start with.
As a start I thought I would try to implement a connector that pulls data from a public API with frequently updated data, such as a weather or finance API.
But right out the gate I am completely stuck.
When I go to Confluent Hub, it seems as if the HTTP Source connector (if that is what I am supposed to work with?) is not available.
Copilot doesn’t seem to understand that, so it keeps adding it to the Dockerfile and then failing.
After a while it tries other solutions from GitHub instead, but fails there as well because the ZIP or JAR files are no longer available, or it just points to a repo and then fails.
Long story short… is there an “easy” way to get a connector or Docker image for this purpose which is free?
I seem to be running around in circles here…
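For what it’s worth, one route that has worked for similar setups is to extend Confluent’s Connect image and install a free community HTTP connector with the confluent-hub CLI. This is only a sketch — the connector ID and version below are assumptions, so check the current Confluent Hub listing before relying on them:

```dockerfile
# Sketch: extend Confluent's Kafka Connect image and install a free,
# community HTTP source connector via the confluent-hub CLI.
# The connector ID (castorm/kafka-connect-http) and both version tags
# are assumptions -- verify them against Confluent Hub before building.
FROM confluentinc/cp-kafka-connect:7.6.0

RUN confluent-hub install --no-prompt castorm/kafka-connect-http:0.8.11
```

Building this image and pointing your Docker Compose Connect service at it should make the connector plugin visible to the Connect worker on startup.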
Disclaimer: The linked cp-all-in-one demo runs Confluent Platform’s Enterprise-licensed product. There is a “forever” Developer license for running one (non-production) broker, which is what this demo does.
Many thanks for your response!
That was exactly what I was looking for.
I am very grateful for that.
I was thinking of connecting to the Nasdaq API and getting the share prices, as that should produce a lot of new data on a regular basis.
But there was quite a lot to figure out for the HTTP connector, and it is not super intuitive what to put where.
The URL is: https://data.nasdaq.com/api/v3/datasets/WIKI/PRICES.json
I do have an API key and Copilot can give me the content of the configuration file.
However, when I click my way in to the GUI, I cannot see anywhere to paste in the full configuration. Instead each setting seems to have its own field and it is not obvious what value in the configuration maps to which field in the GUI.
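For reference, the kind of full configuration I mean looks roughly like this. The property names here are illustrative guesses and may not match the exact connector version, so treat them as assumptions to verify against the connector’s documentation:

```json
{
  "name": "nasdaq-http-source",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSourceConnector",
    "http.api.url": "https://data.nasdaq.com/api/v3/datasets/WIKI/PRICES.json?api_key=MY_KEY",
    "topic.name.pattern": "nasdaq-prices",
    "request.interval.ms": "60000"
  }
}
```

One thing I did find: instead of clicking through the GUI field by field, the whole JSON can apparently be submitted in one go to the Kafka Connect REST API, e.g. `curl -X POST -H "Content-Type: application/json" --data @config.json http://localhost:8083/connectors`.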
The more I poke around, the more questions pop up…
Can parts of the request URL be dynamically set?
For instance, assume that I have a message in a topic for each symbol on Nasdaq, and I need to append that to the request URL for API calls on each specific company.
I guess I could create a static HTTP connector for each company, but that seems like a very inefficient way to do it…
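To make clear what I mean by “dynamic”, here is a small sketch of the pattern I have in mind, with the topic simulated by a plain list. The helper name `build_url` and the per-symbol dataset path are my own assumptions, not anything from a real connector:

```python
# Sketch of the per-symbol URL pattern I'm imagining. The base URL comes
# from the Nasdaq dataset endpoint; the idea that each ticker symbol maps
# to its own dataset path is an assumption on my part.
BASE = "https://data.nasdaq.com/api/v3/datasets/WIKI"


def build_url(symbol: str, api_key: str) -> str:
    """Build a request URL for one company by appending its ticker symbol."""
    return f"{BASE}/{symbol}.json?api_key={api_key}"


# Simulate events arriving on a 'tickers' topic, one message per symbol:
for symbol in ["AAPL", "MSFT"]:
    print(build_url(symbol, "MY_KEY"))
```

Ideally the connector itself would do something like this for every message on the topic, instead of me defining one static connector per company.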
Are you using this connector and not the v2 one? The entity.names parameter, with values that then get templated into ${entityName}, might work for some number of companies. I doubt that it will scale to all NASDAQ listings, though. And this would require the tickers to be in the connector config, not coming from a topic.

If the pattern of consuming from a topic and using data in the events to call out to another API is required, then I don’t think that the HTTP Source connector is the way to go. The NASDAQ API (including docs) is pretty closed down without an account, so I haven’t been able to dig into it to see how well it fits with the HTTP Source connector.
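To illustrate the templating idea, the relevant fragment of a connector config would look something like this. The entity.names parameter and the ${entityName} placeholder are from the connector; the exact URL property name is an assumption, so check the connector docs:

```json
{
  "entity.names": "AAPL,MSFT,GOOG",
  "http.api.url": "https://data.nasdaq.com/api/v3/datasets/WIKI/${entityName}.json"
}
```

The connector would then issue one request per entry in entity.names, substituting each ticker into the URL — which is why the list has to live in the config rather than arrive on a topic.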