
Setup Kafka

Locally

You can run Kafka locally using a Docker image. To run without Docker, see the Kafka getting started guide.
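One way to run Kafka locally is with Docker Compose. The sketch below is a minimal single-node setup; the image name and settings are assumptions, so adjust them to the image you actually use:

```yaml
# docker-compose.yml -- single-node Kafka for local development (illustrative).
version: "3"
services:
  kafka:
    image: apache/kafka:latest
    ports:
      # Expose the broker on localhost:9092
      - "9092:9092"
```

Start it with `docker compose up -d` and point the `brokers` value in your component at `localhost:9092`.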

Kubernetes

To run Kafka on Kubernetes, you can use a Helm chart.
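As a sketch, a typical install with the Bitnami Kafka chart looks like the following; the chart, release name, and namespace are assumptions (the namespace here is chosen to match the broker address used in the component example below):

```
helm repo add bitnami https://charts.bitnami.com/bitnami
helm repo update
# Installing release "dapr-kafka" into namespace "dapr-tests" yields a service
# address like dapr-kafka.dapr-tests.svc.cluster.local:9092
helm install dapr-kafka bitnami/kafka --namespace dapr-tests --create-namespace
```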

Create a Dapr component

The next step is to create a Dapr component for Kafka.

Create the following YAML file named kafka.yaml:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: <NAME>
  namespace: <NAMESPACE>
spec:
  type: pubsub.kafka
  metadata:
    # Kafka broker connection setting
    - name: brokers
      # Comma-separated list of Kafka brokers
      value: "dapr-kafka.dapr-tests.svc.cluster.local:9092"
      # Enable authentication. Default is "false"
    - name: authRequired
      value: "false"
      # Only used if authRequired is set to "true"
    - name: saslUsername
      value: <username>
      # Only used if authRequired is set to "true"
    - name: saslPassword
      value: <password>

The above example uses secrets as plain strings. It is recommended to use a secret store for secrets, as described in the Dapr secret stores documentation.
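As a sketch of the secret-store approach on Kubernetes, the SASL values can reference a Kubernetes secret via `secretKeyRef` instead of literal strings. The secret name and keys below are hypothetical:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: <NAME>
  namespace: <NAMESPACE>
spec:
  type: pubsub.kafka
  metadata:
  - name: brokers
    value: "dapr-kafka.dapr-tests.svc.cluster.local:9092"
  - name: authRequired
    value: "true"
  - name: saslUsername
    secretKeyRef:
      name: kafka-secrets     # hypothetical Kubernetes secret
      key: sasl-username
  - name: saslPassword
    secretKeyRef:
      name: kafka-secrets
      key: sasl-password
auth:
  secretStore: kubernetes
```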

Apply the configuration

In Kubernetes

To apply the Kafka component to your Kubernetes cluster, use kubectl:

kubectl apply -f kafka.yaml
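To confirm the component was registered, you can list the Dapr component custom resources in the namespace you deployed to (substitute your actual namespace):

```
kubectl get components.dapr.io -n <NAMESPACE>
```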

Running locally

The Dapr CLI automatically creates a directory named components in your current working directory, containing a Redis component. To use Kafka instead, replace the pubsub.yaml file (or messagebus.yaml for Dapr < 0.6.0) with the kafka.yaml above.
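With kafka.yaml in place in the components directory, start your app through the Dapr CLI as usual; the app ID, port, and run command below are placeholders, not values from this guide:

```
dapr run --app-id myapp --app-port 3000 node app.js
```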