Document Kafka Broker commit offset interval (#4537)

* Document Kafka Broker commit offset interval

Signed-off-by: Pierangelo Di Pilato <pdipilat@redhat.com>

* Apply review suggestion

Signed-off-by: Pierangelo Di Pilato <pdipilat@redhat.com>
This commit is contained in:
Pierangelo Di Pilato 2021-12-13 16:19:20 +01:00 committed by GitHub
parent 48fa07945d
commit 5681e7e0dc
1 changed file with 35 additions and 0 deletions


@@ -227,6 +227,41 @@ kubectl create secret --namespace <namespace> generic <my_secret> \
!!! note
    `ca.crt` can be omitted to fall back to the system's root CA set.
## Consumer Offsets Commit Interval
Kafka consumers keep track of the last successfully sent events by committing offsets.
Knative Kafka Broker commits the offset every `auto.commit.interval.ms` milliseconds.
!!! note
    Committing offsets every time an event is successfully sent to a subscriber is
    not recommended, because it can negatively impact performance.
The interval can be changed by modifying the `auto.commit.interval.ms` parameter in the
`config-kafka-broker-data-plane` `ConfigMap` in the `knative-eventing` namespace, as follows:
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: config-kafka-broker-data-plane
  namespace: knative-eventing
data:
  # Some configurations omitted ...
  config-kafka-broker-consumer.properties: |
    # Some configurations omitted ...
    # Commit the offset every 5000 milliseconds (5 seconds)
    auto.commit.interval.ms=5000
```
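One way to make this change is to edit the `ConfigMap` in place with `kubectl` (a sketch, assuming you have edit access to the `knative-eventing` namespace):

```shell
# Open the ConfigMap for in-place editing and adjust
# auto.commit.interval.ms under config-kafka-broker-consumer.properties:
kubectl edit configmap config-kafka-broker-data-plane -n knative-eventing

# Or, if the full ConfigMap manifest is stored in a local file, apply it directly:
kubectl apply -f config-kafka-broker-data-plane.yaml
```

The data plane pods read this configuration, so changes take effect when the consumer is reconfigured or restarted.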
!!! note
    The Knative Kafka Broker guarantees at-least-once delivery, which means that your
    applications may receive duplicate events. A higher commit interval increases the
    probability of receiving duplicate events, because when a consumer restarts, it
    resumes from the last committed offset.
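Because of at-least-once delivery, subscribers should be prepared to handle redelivered events. A minimal Python sketch of one common approach, deduplicating by the event's unique id (for CloudEvents, the `id` attribute) with a bounded cache; the `Deduplicator` helper is hypothetical, not part of Knative:

```python
from collections import OrderedDict

class Deduplicator:
    """Tracks recently seen event ids so a subscriber can skip duplicates
    redelivered after a consumer restart (hypothetical helper, not part of
    Knative). Keeps at most max_entries ids, evicting the oldest first."""

    def __init__(self, max_entries: int = 10_000):
        self._seen: "OrderedDict[str, None]" = OrderedDict()
        self._max_entries = max_entries

    def is_duplicate(self, event_id: str) -> bool:
        if event_id in self._seen:
            return True
        self._seen[event_id] = None
        if len(self._seen) > self._max_entries:
            self._seen.popitem(last=False)  # evict the oldest id
        return False

dedup = Deduplicator()
print(dedup.is_duplicate("evt-1"))  # False: first delivery, process the event
print(dedup.is_duplicate("evt-1"))  # True: redelivery, skip it
```

Note that a bounded in-memory cache is only a best-effort guard; truly idempotent event handlers remain the most robust option.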
## Kafka Producer and Consumer configurations

Knative exposes all available Kafka producer and consumer configurations that can be modified to suit your workloads.