Hello,
I am trying to run a simple test with Kafka using Docker.
Here is my docker-compose file:
version: "3.2"
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181"
  kafka:
    image: wurstmeister/kafka
    depends_on:
      - zookeeper
    ports:
      - "9092-9094:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 0.0.0.0
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
  kafkaui:
    image: provectuslabs/kafka-ui
    depends_on:
      - kafka
      - zookeeper
    ports:
      - "8080:8080"
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: "kafka:9092"
As you can see, for KAFKA_ADVERTISED_HOST_NAME I used my IP: 0.0.0.0.
Then I run docker-compose up -d and everything is OK.
Then I run
docker-compose up --scale kafka=3 -d
to have 3 nodes, and that is also OK when I check with docker ps -a: I see all my containers.
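To rule out a basic connectivity problem, a quick port check from the host could look like this (only a sketch; 127.0.0.1 and port 9092 are my assumptions about where Docker publishes the mapped port):

# Sketch: check from the host that something is listening on the published Kafka port.
# 127.0.0.1 and 9092 are assumptions about where docker-compose publishes the port.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(3)
    result = s.connect_ex(("127.0.0.1", 9092))

print("port 9092 open" if result == 0 else f"port 9092 not reachable (errno {result})")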
In my browser I go to 0.0.0.0:8080 and I see the user interface.
I can see the cluster, which is shown as online, but I can't access the "Brokers" menu, as if it were offline. Isn't that weird?
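To see whether the brokers are actually reachable from the host, something like this could be used to ask for cluster metadata (just a sketch; it assumes kafka-python and that the mapped port is visible as localhost:9092):

# Sketch: fetch cluster metadata with kafka-python to see if any broker answers.
# "localhost:9092" is an assumption about the port mapping on the host.
from kafka import KafkaConsumer
from kafka.errors import NoBrokersAvailable

try:
    consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
    print("Topics visible in the cluster:", consumer.topics())
    consumer.close()
except NoBrokersAvailable:
    print("No broker reachable at localhost:9092")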
Then I try to run this Python script:
from kafka import KafkaProducer

kafka_producer = KafkaProducer(bootstrap_servers="localhost:9092")
for i in range(1, 4):
    kafka_producer.send(topic="test", value=f"New message # {i}".encode("utf-8"))
kafka_producer.flush()
and I get something like a timeout error.
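Wrapping the send in error handling shows roughly where it fails (again just a sketch; the exception types are from kafka-python and localhost:9092 is the same assumption as above):

# Sketch: same producer loop, catching the errors kafka-python raises,
# to see where the timeout happens. localhost:9092 is assumed reachable from the host.
from kafka import KafkaProducer
from kafka.errors import KafkaTimeoutError, NoBrokersAvailable

try:
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    for i in range(1, 4):
        future = producer.send(topic="test", value=f"New message # {i}".encode("utf-8"))
        future.get(timeout=10)  # block until the broker acknowledges, or time out
    producer.flush()
except NoBrokersAvailable:
    print("Could not bootstrap: no broker reachable at localhost:9092")
except KafkaTimeoutError as exc:
    print("Timed out talking to the broker:", exc)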
In the user interface the cluster is now offline, but all my containers are still running, so I don't know what is wrong.
I have a weak and old computer: a Core i3 with 4 GB of RAM. Is my computer not powerful enough?