Fluentd not consuming from Kafka

Hi, I have an EFK stack on k8s. A dummy app generates logs, which I collect with Fluent Bit and push to a single-node Kafka. On the other side, a DaemonSet of Fluentd consumes the logs from the topic and inserts them into Elasticsearch. So far everything works. I then tested a scenario where Fluentd crashes: when I delete the ReplicaSet from k8s and recreate it after a while, I expect Fluentd to resume consuming from where it left off in the topic. Instead, it ignores the older messages and starts inserting only new logs into Elasticsearch, so now there is a gap in Elasticsearch. I checked the timestamps on the logs, and the missing range is not covered by them.

These are my config files:
fluent-bit:

    [INPUT]
        Name              tail
        Tag               docker-log-gen.*
        Path              /var/log/containers/docker-log-generator*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     100MB
        Skip_Long_Lines   On
        Refresh_Interval  10

    [OUTPUT]
        Name            kafka
        Match           docker-log-gen.*
        Brokers         kafka-service:9092
        Logstash_Format Off
        topics          fluentbit-docker-log-gen
        Replace_Dots    On
        Retry_Limit     False

fluentd:

<source>
   @type  kafka
   @log_level info
   brokers  kafka-service
   topics fluentbit-docker-log-gen
   time_key time
</source>
<match *>
   @type elasticsearch
   time_key time
   ....
</match>

I would really appreciate any help with this.
Thanks a lot

This sounds like more of a Fluentd question than something the Confluent community can necessarily help with.

Have you considered using Kafka Connect to stream the data to Elasticsearch instead? I’m not familiar with Fluentd so perhaps this suggestion is redundant, but Kafka Connect supports delivery guarantees, restarts, etc., so it may be useful here.
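
That said, the symptom described (restarting and seeing only new messages) is what you'd expect from a Kafka consumer that never commits offsets. As a sketch only, assuming the fluent-plugin-kafka plugin that provides the `kafka` input type: its `kafka_group` input joins a consumer group and commits offsets back to Kafka, so a restarted pod resumes where the group left off. Option names should be verified against that plugin's README:

```
<source>
   @type kafka_group
   brokers kafka-service:9092
   # consumer_group is an illustrative name; Kafka stores committed
   # offsets per group, so keep it stable across restarts
   consumer_group fluentd-es-consumer
   topics fluentbit-docker-log-gen
   format json
</source>
```

With the plain `kafka` input and no committed offsets, each fresh process typically starts from the latest offset, which would produce exactly the gap you observed.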