mirror of https://github.com/knative/docs.git

Get rid of refs to old KafkaChannels (#5009)

commit 42fd74d11a (parent 55459316be)
@ -89,12 +89,11 @@ default Channel configuration for Knative Eventing.

```bash
...
__consumer_offsets
knative-messaging-kafka.default.my-kafka-channel
knative-messaging-kafka.default.testchannel-one
...
```

The Kafka Topic that is created by the Channel contains the name of the namespace, `default` in this example, followed by the name of the Channel. In the consolidated Channel implementation, it is also prefixed with `knative-messaging-kafka` to indicate that it is a Kafka Channel from Knative.

The Kafka Topic that is created by the Channel contains the name of the namespace, `default` in this example, followed by the name of the Channel.

**Note:** The topic of a Kafka Channel is an implementation detail, and records from it should not be consumed by other applications.
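For reference, one way to check which topics exist on the cluster (and therefore confirm that the Channel's topic was created) is to run the Kafka topic tooling directly against a broker. This is only a hedged sketch: the namespace, Pod name, and script path below are assumptions based on a typical Strimzi-style installation and must be adjusted for your environment.

```bash
# Hypothetical example: list topics from inside a Kafka broker Pod.
# The namespace and Pod name are placeholders for your Kafka installation.
kubectl -n kafka exec -it my-cluster-kafka-0 -- \
  bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
```
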
@ -234,131 +233,18 @@ The following example uses a ApiServerSource to publish events to an existing Br

kubectl logs --selector='serving.knative.dev/service=broker-kafka-display' -c user-container
```

## Authentication against an Apache Kafka cluster

In production environments it is common for the Apache Kafka cluster to be secured using [TLS](http://kafka.apache.org/documentation/#security_ssl) or [SASL](http://kafka.apache.org/documentation/#security_sasl). This section shows how to configure a Kafka Channel to work against a protected Apache Kafka cluster, using the two supported authentication methods: TLS and SASL.

**Note:** Kafka Channels require certificates to be in `.pem` format. If your files are in a different format, you must convert them to `.pem`.
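For example, if your client certificate and key are currently in a Java keystore or PKCS#12 bundle, they can usually be exported to `.pem` with standard `keytool` and `openssl` tooling. The file names below are placeholders, and this is only a sketch of one common conversion path.

```bash
# Hypothetical example: convert a Java keystore to PKCS#12, then export it as PEM.
# All file names are placeholders.
keytool -importkeystore -srckeystore kafka-client.jks \
  -destkeystore kafka-client.p12 -deststoretype PKCS12
openssl pkcs12 -in kafka-client.p12 -out kafka-client.pem -nodes
```
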
### TLS authentication

1. Edit the `config-kafka` ConfigMap:

    ```bash
    kubectl -n knative-eventing edit configmap config-kafka
    ```

1. Set the `TLS.Enable` field to `true`:

    ```yaml
    ...
    data:
      sarama: |
        config: |
          Net:
            TLS:
              Enable: true
    ...
    ```

1. Optional: If you are using a custom CA certificate, add your certificate data to the ConfigMap in the `data.sarama.config.Net.TLS.Config.RootPEMs` field:

    ```yaml
    ...
    data:
      sarama: |
        config: |
          Net:
            TLS:
              Config:
                RootPEMs: # Array of root certificate PEM files (use the '|-' syntax to preserve line feeds and avoid a terminating \n)
                - |-
                  -----BEGIN CERTIFICATE-----
                  MIIGDzCCA/egAwIBAgIUWq6j7u/25wPQiNMPZqL6Vy0rkvQwDQYJKoZIhvcNAQEL
                  ...
                  771uezZAFqd1GLLL8ZYRmCsAMg==
                  -----END CERTIFICATE-----
    ...
    ```
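If the CA certificate is already stored in a Kubernetes Secret, one convenient way to print it in PEM form so that it can be pasted into `RootPEMs` is shown below. The Secret name, namespace, and key (`ca.crt`) are assumptions; use whatever your cluster actually stores.

```bash
# Hypothetical example: print a CA certificate held in a Secret so it can be
# pasted into the RootPEMs field. Secret name, namespace, and key are placeholders.
kubectl -n <namespace> get secret <ca-secret> -o jsonpath='{.data.ca\.crt}' | base64 --decode
```
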
### SASL authentication

To use SASL authentication, you will need the following information:

* A username and password.
* The type of SASL mechanism you wish to use, for example `PLAIN`, `SCRAM-SHA-256`, or `SCRAM-SHA-512`.

**Note:** It is recommended to also enable TLS as described in the previous section.

1. Edit the `config-kafka` ConfigMap:

    ```bash
    kubectl -n knative-eventing edit configmap config-kafka
    ```

1. Set the `SASL.Enable` field to `true`:

    ```yaml
    ...
    data:
      sarama: |
        config: |
          Net:
            SASL:
              Enable: true
    ...
    ```

1. Create a secret that uses the username, password, and SASL mechanism:

    ```bash
    kubectl create secret --namespace <namespace> generic <kafka-auth-secret> \
        --from-literal=password="SecretPassword" \
        --from-literal=saslType="PLAIN" \
        --from-literal=username="my-sasl-user"
    ```
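As an optional sanity check, individual keys of the Secret can be decoded without printing the password; the command below simply confirms the stored SASL mechanism and assumes the Secret created in the previous step.

```bash
# Optional check: print the SASL mechanism stored in the Secret
# (prints "PLAIN" for the example above).
kubectl -n <namespace> get secret <kafka-auth-secret> -o jsonpath='{.data.saslType}' | base64 --decode
```
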
### All authentication methods

1. If you have created a secret for your desired authentication method by using the previous steps, reference the secret and its namespace in the `config-kafka` ConfigMap:

    ```yaml
    ...
    data:
      eventing-kafka: |
        kafka:
          authSecretName: <kafka-auth-secret>
          authSecretNamespace: <namespace>
    ...
    ```

**Note:** The default secret name and namespace are `kafka-cluster` and `knative-eventing` respectively. If you reference a secret in a different namespace, make sure you configure your roles and bindings so that the `knative-eventing` Pods can access it.
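A minimal sketch of such a Role and RoleBinding created with `kubectl` follows. The role name, binding name, and especially the service account are placeholders; which service account actually needs read access depends on how your channel controller and dispatcher are deployed.

```bash
# Hypothetical sketch: allow a knative-eventing service account to read the
# auth secret in another namespace. All names are placeholders.
kubectl -n <namespace> create role kafka-auth-secret-reader \
  --verb=get,list,watch --resource=secrets --resource-name=<kafka-auth-secret>

kubectl -n <namespace> create rolebinding kafka-auth-secret-reader-binding \
  --role=kafka-auth-secret-reader \
  --serviceaccount=knative-eventing:<channel-service-account>
```
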
## Channel configuration

The `config-kafka` ConfigMap allows for a variety of Channel options such as:

KafkaChannel provides various configuration options, such as:

- CPU and Memory requests and limits for the dispatcher (and receiver for the distributed Channel type) deployments created by the controller
- Authentication
- Encryption
- Kafka Topic default values (number of partitions, replication factor, and retention time)
- Maximum idle connections/connections per host for Knative cloudevents
- The brokers string for your Kafka connection
- The name and namespace of your TLS/SASL authentication secret
- The Kafka admin type (distributed channel only)
- Nearly all the settings exposed in a [Sarama Config Struct](https://github.com/Shopify/sarama/blob/master/config.go)
- Sarama debugging assistance (via sarama.enableLogging)

For detailed information (particularly for the distributed channel), see the [Distributed Channel README](https://github.com/knative-sandbox/eventing-kafka/blob/main/config/channel/distributed/README.md).

For detailed information on configuration options, see the [KafkaChannel README](https://github.com/knative-sandbox/eventing-kafka-broker/blob/main/docs/channel/README.md).
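As a starting point for tuning any of these options in the consolidated or distributed implementations, you can inspect the current settings directly; this simply dumps the same ConfigMap edited throughout this page.

```bash
# View the current Channel configuration before editing it.
kubectl -n knative-eventing get configmap config-kafka -o yaml
```
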
@ -1,75 +0,0 @@

# ResetOffset Example

Kafka-backed Channels differ from other Channel implementations because they provide temporal persistence of events.

Event persistence enables certain advanced use cases, such as replaying prior events by repositioning the current location in the event stream. This is useful when recovering from unexpected Subscriber downtime.

The [ResetOffset custom resource definition (CRD)](https://github.com/knative-sandbox/eventing-kafka/tree/main/config/command/resetoffset) exposes the ability to manipulate the location of the ConsumerGroup Offsets in the event stream of a given Knative Subscription. Without the ResetOffset CRD, you must manually stop ConsumerGroups and manipulate the Offsets.

**Note:** ResetOffsets are currently only supported by the Distributed KafkaChannel implementation.

**Warning:** Repositioning the ConsumerGroup Offsets impacts event ordering and is intended for failure recovery scenarios. Use this capability with caution, and only after reviewing the [CRD documentation](https://github.com/knative-sandbox/eventing-kafka/tree/main/config/command/resetoffset).

## Prerequisites

- A Kubernetes cluster with the [Kafka Channel implementation](https://knative.dev/docs/eventing/channels/channels-crds/) installed.
- An existing, valid KafkaChannel resource and Subscription.

**Note:** A ResetOffset is a single-use operation, and should be deleted from the cluster once it has been executed.
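For example, once the reposition has completed, the resource can be removed like any other namespaced object. This assumes the CRD registers the lowercase resource name `resetoffset`; the name and namespace are placeholders.

```bash
# Hypothetical example: remove a ResetOffset resource after it has executed.
kubectl -n <namespace> delete resetoffset <resetoffset-name>
```
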
## Repositioning offsets to the oldest available event

The following ResetOffset instance moves the offsets back to the oldest available event in the Kafka Topic retention window:

```yaml
apiVersion: kafka.eventing.knative.dev/v1alpha1
kind: ResetOffset
metadata:
  name: <resetoffset-name>
  namespace: <namespace>
spec:
  offset:
    time: earliest
  ref:
    apiVersion: messaging.knative.dev/v1
    kind: Subscription
    namespace: <subscription-namespace>
    name: <subscription-name>
```

Where:

- `<resetoffset-name>` is the name of the ResetOffset CRD.
- `<namespace>` is the namespace where you want to create the ResetOffset CRD.
- `<subscription-namespace>` is the namespace where your Subscription exists.
- `<subscription-name>` is the name of the Subscription.
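A sketch of how such a manifest might be applied and then inspected follows; the file name is a placeholder, and the exact status conditions reported depend on the ResetOffset controller in your installation.

```bash
# Hypothetical usage sketch: apply the ResetOffset manifest and check its status.
kubectl apply -f resetoffset.yaml
kubectl -n <namespace> get resetoffset <resetoffset-name> -o yaml
```
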
## Repositioning offsets to a specific point in the event stream

The following ResetOffset instance moves the offsets back to a specified point in the Kafka Topic retention window:

```yaml
apiVersion: kafka.eventing.knative.dev/v1alpha1
kind: ResetOffset
metadata:
  name: <resetoffset-name>
  namespace: <namespace>
spec:
  offset:
    time: <offset-time>
  ref:
    apiVersion: messaging.knative.dev/v1
    kind: Subscription
    namespace: <subscription-namespace>
    name: <subscription-name>
```

Where:

- `<resetoffset-name>` is the name of the ResetOffset CRD.
- `<offset-time>` is the target position in the event stream, specified as an [RFC3339](https://datatracker.ietf.org/doc/html/rfc3339) timestamp, for example `"2021-06-17T17:30:00Z"`. See the sketch after this list for one way to generate such a timestamp.
- `<namespace>` is the namespace where you want to create the ResetOffset CRD.
- `<subscription-namespace>` is the namespace where your Subscription exists.
- `<subscription-name>` is the name of the Subscription.
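One way to generate an RFC3339 timestamp for `<offset-time>` relative to the current time is shown below. This assumes GNU `date` (the flags differ on macOS/BSD).

```bash
# Generate a UTC RFC3339 timestamp for one hour ago (GNU date).
date -u -d '1 hour ago' '+%Y-%m-%dT%H:%M:%SZ'
```
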
@ -127,7 +127,6 @@ plugins:

eventing/samples/helloworld/helloworld-python/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/helloworld/helloworld-python
eventing/samples/kafka/binding/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/binding
eventing/samples/kafka/channel/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/channel
eventing/samples/kafka/resetoffset/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/resetoffset
eventing/samples/kubernetes-event-source/index.md: eventing/sources/apiserversource/README.md
eventing/samples/parallel/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/parallel
eventing/samples/parallel/multiple-branches/README.md: https://github.com/knative/docs/tree/main/code-samples/eventing/parallel/multiple-branches
@ -20,7 +20,5 @@ Name | Status | Support | Description

--- | --- | --- | ---
[GCP PubSub](https://github.com/google/knative-gcp) | Proof of Concept | None | Channels are backed by [GCP PubSub](https://cloud.google.com/pubsub/).
[InMemoryChannel](https://github.com/knative/eventing/tree/{{ branch }}/config/channels/in-memory-channel/README.md) | Proof of Concept | None | In-memory channels are a best effort Channel. They should NOT be used in Production. They are useful for development.
[KafkaChannel - Consolidated](https://github.com/knative-sandbox/eventing-kafka/tree/{{ branch }}/pkg/channel/consolidated/README.md) | Proof of Concept | None | Channels are backed by [Apache Kafka](http://kafka.apache.org/) topics. The original Knative KafkaChannel implementation which utilizes a single combined Kafka Producer / Consumer deployment.
[KafkaChannel - Distributed](https://github.com/knative-sandbox/eventing-kafka/tree/{{ branch }}/pkg/channel/distributed/README.md) | Proof of Concept | None | Channels are backed by [Apache Kafka](http://kafka.apache.org/) topics. An alternate KafkaChannel implementation, contributed by SAP's [Kyma](https://kyma-project.io/) project, which provides a more granular deployment of Producers / Consumers.
[KafkaChannel - New](https://github.com/knative-sandbox/eventing-kafka-broker/tree/{{ branch }}/docs/channel/README.md) | Proof of Concept | None | Channels are backed by [Apache Kafka](http://kafka.apache.org/) topics. A new KafkaChannel implementation, which provides granular deployments and better scalability.
[KafkaChannel](https://github.com/knative-sandbox/eventing-kafka-broker/tree/{{ branch }}/docs/channel/README.md) | Proof of Concept | None | Channels are backed by [Apache Kafka](http://kafka.apache.org/) topics.
[NatssChannel](https://github.com/knative-sandbox/eventing-natss/tree/{{ branch }}/config/README.md) | Proof of Concept | None | Channels are backed by [NATS Streaming](https://github.com/nats-io/nats-streaming-server#configuring).
@ -17,7 +17,6 @@ See [all Knative code samples](https://github.com/knative/docs/tree/main/code-sa

| GitLab source | Shows how to wire GitLab events for consumption by a Knative Service. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/gitlab-source) |
| Apache Kafka Binding | KafkaBinding is responsible for injecting Kafka bootstrap connection information into a Kubernetes resource that embeds a PodSpec (as `spec.template.spec`). This enables easy bootstrapping of a Kafka client. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/binding) |
| Apache Kafka Channel | Install and configure the Apache Kafka Channel as the default Channel configuration for Knative Eventing. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/channel) |
| ResetOffset | Only supported by the Distributed KafkaChannel implementation. The [ResetOffset custom resource definition (CRD)](https://github.com/knative-sandbox/eventing-kafka/tree/main/config/command/resetoffset) exposes the ability to manipulate the location of the ConsumerGroup Offsets in the event stream of a given Knative Subscription. Without the ResetOffset CRD, you must manually stop ConsumerGroups and manipulate the Offsets. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/kafka/resetoffset) |
| Writing an event source using JavaScript | This tutorial provides instructions to build an event source in JavaScript and implement it with a ContainerSource or SinkBinding. | [JavaScript](https://github.com/knative/docs/tree/main/code-samples/eventing/writing-event-source-easy-way) |
| Parallel with multiple cases | Create a Parallel with two branches. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/parallel/multiple-branches) |
| Parallel with mutually exclusive cases | Create a Parallel with mutually exclusive branches. | [YAML](https://github.com/knative/docs/tree/main/code-samples/eventing/parallel/mutual-exclusivity) |