diff --git a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
index e0e4167c9..616bd31d6 100644
--- a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
+++ b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
@@ -27,11 +27,11 @@ spec:
     value: "group1"
   - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
     value: "my-dapr-app-id"
-  - name: authRequired # Required.
-    value: "true"
-  - name: saslUsername # Required if authRequired is `true`.
+  - name: authType # Required.
+    value: "password"
+  - name: saslUsername # Required if authType is `password`.
     value: "adminuser"
-  - name: saslPassword # Required if authRequired is `true`.
+  - name: saslPassword # Required if authType is `password`.
     secretKeyRef:
       name: kafka-secrets
       key: saslPasswordSecret
@@ -50,22 +50,171 @@ spec:
 | brokers | Y | A comma-separated list of Kafka brokers. | `"localhost:9092,dapr-kafka.myapp.svc.cluster.local:9093"`
 | consumerGroup | N | A kafka consumer group to listen on. Each record published to a topic is delivered to one consumer within each consumer group subscribed to the topic. | `"group1"`
 | clientID | N | A user-provided string sent with every request to the Kafka brokers for logging, debugging, and auditing purposes. Defaults to `"sarama"`. | `"my-dapr-app"`
-| authRequired | Y | Enable [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication with the Kafka brokers. | `"true"`, `"false"`
-| saslUsername | N | The SASL username used for authentication. Only required if `authRequired` is set to `"true"`. | `"adminuser"`
-| saslPassword | N | The SASL password used for authentication. Can be `secretKeyRef` to use a [secret reference]({{< ref component-secrets.md >}}). Only required if `authRequired` is set to `"true"`. | `""`, `"KeFg23!"`
+| authRequired | N | *Deprecated* Enable [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication with the Kafka brokers. | `"true"`, `"false"`
+| authType | Y | Configure or disable authentication. Supported values: `none`, `password`, `mtls`, or `oidc` | `"password"`, `"none"`
+| saslUsername | N | The SASL username used for authentication. Only required if `authType` is set to `"password"`. | `"adminuser"`
+| saslPassword | N | The SASL password used for authentication. Can be `secretKeyRef` to use a [secret reference]({{< ref component-secrets.md >}}). Only required if `authType` is set to `"password"`. | `""`, `"KeFg23!"`
 | initialOffset | N | The initial offset to use if no offset was previously committed. Should be "newest" or "oldest". Defaults to "newest". | `"oldest"`
 | maxMessageBytes | N | The maximum size in bytes allowed for a single Kafka message. Defaults to 1024. | `2048`
 | consumeRetryInterval | N | The interval between retries when attempting to consume topics. Treats numbers without suffix as milliseconds. Defaults to 100ms. | `200ms`
 | version | N | Kafka cluster version. Defaults to 2.0.0.0 | `0.10.2.0`
 | caCert | N | Certificate authority certificate, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
-| clientCert | N | Client certificate, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
-| clientKey | N | Client key, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN RSA PRIVATE KEY-----\n\n-----END RSA PRIVATE KEY-----"`
+| clientCert | N | Client certificate, required for `authType` `mtls`. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
+| clientKey | N | Client key, required for `authType` `mtls`. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN RSA PRIVATE KEY-----\n\n-----END RSA PRIVATE KEY-----"`
 | skipVerify | N | Skip TLS verification, this is not recommended for use in production. Defaults to `"false"` | `"true"`, `"false"` |
+| disableTls | N | Disable TLS for transport security. This is not recommended for use in production. Defaults to `"false"` | `"true"`, `"false"` |
+| oidcTokenEndpoint | N | Full URL to an OAuth2 identity provider access token endpoint. Required when `authType` is set to `oidc` | `"https://identity.example.com/v1/token"` |
+| oidcClientID | N | The OAuth2 client ID that has been provisioned in the identity provider. Required when `authType` is set to `oidc` | `dapr-kafka` |
+| oidcClientSecret | N | The OAuth2 client secret that has been provisioned in the identity provider. Required when `authType` is set to `oidc` | `"KeFg23!"` |
+| oidcScopes | N | Comma-delimited list of OAuth2/OIDC scopes to request with the access token. Recommended when `authType` is set to `oidc`. Defaults to `"openid"` | `"openid,kafka-prod"` |
 
-### Communication using TLS
-To configure communication using TLS, ensure the Kafka broker is configured to support certificates.
-Pre-requisite includes `certficate authority certificate`, `ca issued client certificate`, `client private key`.
-Below is an example of a Kafka pubsub component configured to use TLS:
+
+The `secretKeyRef` above references a [Kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the TLS information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
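+
+For example, when running on Kubernetes, the `kafka-secrets` and `kafka-tls` secrets referenced on this page can be ordinary Kubernetes secrets in the same namespace as the component; the secret name and keys only need to match what the `secretKeyRef` entries reference. A minimal sketch (the password value is illustrative only):
+
+```yaml
+apiVersion: v1
+kind: Secret
+metadata:
+  name: kafka-secrets
+  namespace: default
+type: Opaque
+stringData:
+  saslPasswordSecret: "KeFg23!"
+```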
+
+### Authentication
+
+Kafka supports a variety of authentication schemes and Dapr supports several of them: SASL password, mTLS, and OIDC/OAuth2. With the addition of these authentication methods, the `authRequired` field has been deprecated
+in favor of the `authType` field. If `authRequired` is set to `true`, Dapr will attempt to configure `authType` correctly based on the value of `saslPassword`. There are four valid values for `authType`: `none`, `password`, `mtls`, and `oidc`. Note that this is authentication only; authorization is still configured within Kafka.
+
+#### None
+
+Setting `authType` to `none` disables any authentication. This is *NOT* recommended in production.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: kafka-pubsub-noauth
+  namespace: default
+spec:
+  type: pubsub.kafka
+  version: v1
+  metadata:
+  - name: brokers # Required. Kafka broker connection setting
+    value: "dapr-kafka.myapp.svc.cluster.local:9092"
+  - name: consumerGroup # Optional. Used for input bindings.
+    value: "group1"
+  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+    value: "my-dapr-app-id"
+  - name: authType # Required.
+    value: "none"
+  - name: maxMessageBytes # Optional.
+    value: 1024
+  - name: consumeRetryInterval # Optional.
+    value: 200ms
+  - name: version # Optional.
+    value: 0.10.2.0
+  - name: disableTls
+    value: "true"
+```
+
+#### SASL Password
+
+Setting `authType` to `password` enables [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication using the **PLAIN** mechanism. This requires setting
+the `saslUsername` and `saslPassword` fields.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: kafka-pubsub-sasl
+  namespace: default
+spec:
+  type: pubsub.kafka
+  version: v1
+  metadata:
+  - name: brokers # Required. Kafka broker connection setting
+    value: "dapr-kafka.myapp.svc.cluster.local:9092"
+  - name: consumerGroup # Optional. Used for input bindings.
+    value: "group1"
+  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+    value: "my-dapr-app-id"
+  - name: authType # Required.
+    value: "password"
+  - name: saslUsername # Required if authType is `password`.
+    value: "adminuser"
+  - name: saslPassword # Required if authType is `password`.
+    secretKeyRef:
+      name: kafka-secrets
+      key: saslPasswordSecret
+  - name: maxMessageBytes # Optional.
+    value: 1024
+  - name: consumeRetryInterval # Optional.
+    value: 200ms
+  - name: version # Optional.
+    value: 0.10.2.0
+  - name: caCert
+    secretKeyRef:
+      name: kafka-tls
+      key: caCert
+```
+
+#### Mutual TLS
+
+Setting `authType` to `mtls` uses an x509 client certificate (the `clientCert` field) and key (the `clientKey` field) to authenticate. Note that mTLS as an
+authentication mechanism is distinct from using TLS to secure the transport layer via encryption. mTLS requires TLS transport (meaning `disableTls` must be `false`), but securing
+the transport layer does not require using mTLS. See _Communication using TLS_ for configuring the underlying TLS transport.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: kafka-pubsub-mtls
+  namespace: default
+spec:
+  type: pubsub.kafka
+  version: v1
+  metadata:
+  - name: brokers # Required. Kafka broker connection setting
+    value: "dapr-kafka.myapp.svc.cluster.local:9092"
+  - name: consumerGroup # Optional. Used for input bindings.
+    value: "group1"
+  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+    value: "my-dapr-app-id"
+  - name: authType # Required.
+    value: "mtls"
+  - name: caCert
+    secretKeyRef:
+      name: kafka-tls
+      key: caCert
+  - name: clientCert
+    secretKeyRef:
+      name: kafka-tls
+      key: clientCert
+  - name: clientKey
+    secretKeyRef:
+      name: kafka-tls
+      key: clientKey
+  - name: maxMessageBytes # Optional.
+    value: 1024
+  - name: consumeRetryInterval # Optional.
+    value: 200ms
+  - name: version # Optional.
+    value: 0.10.2.0
+```
+
+#### OAuth2 or OpenID Connect
+
+Setting `authType` to `oidc` enables SASL authentication via the **OAUTHBEARER** mechanism. This supports specifying a bearer
+token from an external OAuth2 or [OIDC](https://en.wikipedia.org/wiki/OpenID) identity provider. Currently, only the **client_credentials** grant is supported. Configure `oidcTokenEndpoint` to
+the full URL for the identity provider access token endpoint. Set `oidcClientID` and `oidcClientSecret` to the client credentials provisioned in the identity provider. If `caCert`
+is specified in the component configuration, the certificate will be appended to the system CA trust for verifying the identity provider certificate. Similarly, if `skipVerify`
+is specified in the component configuration, it will also be applied when accessing the identity provider. By default, the only scope requested for the token is `openid`; it is highly recommended
+that additional scopes be specified via `oidcScopes` in a comma-separated list and validated by the Kafka broker. If additional scopes are not used to narrow the validity of the access token,
+a compromised Kafka broker could replay the token to access other services as the Dapr `clientID`.
 
 ```yaml
 apiVersion: dapr.io/v1alpha1
 kind: Component
 metadata:
@@ -83,9 +232,57 @@ spec:
     value: "group1"
   - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
     value: "my-dapr-app-id"
-  - name: authRequired # Required.
-    value: "true"
-  - name: saslUsername # Required if authRequired is `true`.
+  - name: authType # Required.
+    value: "oidc"
+  - name: oidcTokenEndpoint # Required if authType is `oidc`.
+    value: "https://identity.example.com/v1/token"
+  - name: oidcClientID # Required if authType is `oidc`.
+    value: "dapr-myapp"
+  - name: oidcClientSecret # Required if authType is `oidc`.
+    secretKeyRef:
+      name: kafka-secrets
+      key: oidcClientSecret
+  - name: oidcScopes # Recommended if authType is `oidc`.
+    value: "openid,kafka-dev"
+  - name: caCert # Also applied to verifying OIDC provider certificate
+    secretKeyRef:
+      name: kafka-tls
+      key: caCert
+  - name: maxMessageBytes # Optional.
+    value: 1024
+  - name: consumeRetryInterval # Optional.
+    value: 200ms
+  - name: version # Optional.
+    value: 0.10.2.0
+```
+
+### Communication using TLS
+
+By default, TLS is enabled to secure the transport layer to Kafka. To disable TLS, set `disableTls` to `true`. When TLS is enabled, you can
+control server certificate verification using `skipVerify` to disable verification (*NOT* recommended in production environments) and `caCert` to
+specify a trusted TLS certificate authority (CA). If no `caCert` is specified, the system CA trust will be used. To also configure mTLS authentication,
+see the section under _Authentication_.
+Below is an example of a Kafka pubsub component configured to use transport layer TLS:
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: kafka-pubsub
+  namespace: default
+spec:
+  type: pubsub.kafka
+  version: v1
+  metadata:
+  - name: brokers # Required. Kafka broker connection setting
+    value: "dapr-kafka.myapp.svc.cluster.local:9092"
+  - name: consumerGroup # Optional. Used for input bindings.
+    value: "group1"
+  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+    value: "my-dapr-app-id"
+  - name: authType # Required.
+    value: "password"
+  - name: saslUsername # Required if authType is `password`.
     value: "adminuser"
   - name: consumeRetryInterval # Optional.
     value: 200ms
@@ -101,21 +298,10 @@ spec:
     secretKeyRef:
       name: kafka-tls
       key: caCert
-  - name: clientCert # Client certificate.
-    secretKeyRef:
-      name: kafka-tls
-      key: clientCert
-  - name: clientKey # Client key.
-    secretKeyRef:
-      name: kafka-tls
-      key: clientKey
 auth:
   secretStore:
 ```
-The `secretKeyRef` above is referencing a [kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the tls information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
-
-
 
 ## Per-call metadata fields
 
 ### Partition Key
@@ -154,4 +340,4 @@ To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](ht
 
 ## Related links
 - [Basic schema for a Dapr component]({{< ref component-schema >}})
 - Read [this guide]({{< ref "howto-publish-subscribe.md##step-1-setup-the-pubsub-component" >}}) for instructions on configuring pub/sub components
-- [Pub/Sub building block]({{< ref pubsub >}})
\ No newline at end of file
+- [Pub/Sub building block]({{< ref pubsub >}})