mirror of https://github.com/dapr/docs.git
Merge branch 'v1.13' into issue_3869
commit 216679ba95
@ -52,6 +52,14 @@ For example:
For more information, read the [Pluggable components overview]({{< ref "pluggable-components-overview" >}}).

## Hot Reloading

With the [`HotReload` feature enabled]({{< ref "support-preview-features.md" >}}), components can be "hot reloaded" at runtime.
This means that you can update component configuration without restarting the Dapr runtime.
Component reloading occurs when a component resource is created, updated, or deleted, either in the Kubernetes API or, in self-hosted mode, when a file is changed in the `resources` directory.
When a component is updated, the component is first closed, and then reinitialized using the new configuration.
The component is unavailable for a short period during reload and reinitialization.
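For example, in self-hosted mode you can opt in to the preview feature with a Dapr Configuration and point `daprd` at a `resources` directory. A minimal sketch (the file name, app ID, and paths are illustrative):

```bash
# Enable the HotReload preview feature via a Dapr Configuration file
cat > config.yaml <<'EOF'
apiVersion: dapr.io/v1alpha1
kind: Configuration
metadata:
  name: featureconfig
spec:
  features:
    - name: HotReload
      enabled: true
EOF

# Run with the configuration; edits to files under ./resources are then
# picked up at runtime without restarting daprd
dapr run --app-id myapp --config ./config.yaml --resources-path ./resources
```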
## Available component types

The following are the component types provided by Dapr:
@ -6,18 +6,27 @@ weight: 300
description: "Updating deployed components used by applications"
---

When making an update to an existing deployed component used by an application, Dapr does not update the component automatically unless the `HotReload` feature gate is enabled.
The Dapr sidecar needs to be restarted in order to pick up the latest version of the component.
How this is done depends on the hosting environment.

{{% alert title="Note" color="primary" %}}
Dapr can be made to "hot reload" components, where updates are picked up automatically without needing a restart.
This is enabled via the [`HotReload` feature gate]({{< ref "support-preview-features.md" >}}).
All component types are supported for hot reloading.
This feature is currently in preview.
{{% /alert %}}
## Kubernetes

When running in Kubernetes, the process of updating a component involves two steps:

1. Apply the new component YAML to the desired namespace.
1. Unless the [`HotReload` feature gate is enabled]({{< ref "support-preview-features.md" >}}), perform a [rollout restart operation](https://kubernetes.io/docs/reference/kubectl/cheatsheet/#updating-resources) on your deployments to pick up the latest component, as shown below.
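For example, assuming a component manifest `component.yaml`, a namespace `my-namespace`, and a deployment `my-app` (all illustrative names):

```bash
# Step 1: apply the updated component to the namespace
kubectl apply -f component.yaml --namespace my-namespace

# Step 2: restart the deployment so its Dapr sidecar loads the new component
kubectl rollout restart deployment/my-app --namespace my-namespace
```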
## Self Hosted

Unless the [`HotReload` feature gate is enabled]({{< ref "support-preview-features.md" >}}), the process of updating a component involves a single step of stopping and restarting the `daprd` process to pick up the latest component.
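For example, using the Dapr CLI to manage the `daprd` process (the app ID and run command are illustrative):

```bash
# Stop the app and its daprd sidecar
dapr stop --app-id myapp

# Start it again so daprd loads the updated component definitions
dapr run --app-id myapp --resources-path ./resources -- python3 app.py
```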
## Further reading

- [Components concept]({{< ref components-concept.md >}})
@ -22,4 +22,4 @@ For CLI there is no explicit opt-in, just the version that this was first made a
| **Service invocation for non-Dapr endpoints** | Allow the invocation of non-Dapr endpoints by Dapr using the [Service invocation API]({{< ref service_invocation_api.md >}}). Read ["How-To: Invoke Non-Dapr Endpoints using HTTP"]({{< ref howto-invoke-non-dapr-endpoints.md >}}) for more information. | N/A | [Service invocation API]({{< ref service_invocation_api.md >}}) | v1.11 |
| **Actor State TTL** | Allow actors to save records to state stores with Time To Live (TTL) set to automatically clean up old data. In its current implementation, actor state with TTL may not be reflected correctly by clients; read [Actor State Transactions]({{< ref actors_api.md >}}) for more information. | `ActorStateTTL` | [Actor State Transactions]({{< ref actors_api.md >}}) | v1.11 |
| **Transactional Outbox** | Allows state operations for inserts and updates to be published to a configured pub/sub topic using a single transaction across the state store and the pub/sub. | N/A | [Transactional Outbox Feature]({{< ref howto-outbox.md >}}) | v1.12 |
| **Component Hot Reloading** | Allows Dapr-loaded components to be "hot reloaded". A component spec is reloaded when it is created, updated, or deleted in Kubernetes, or when its file changes in self-hosted mode. | `HotReload` | [Hot Reloading]({{< ref components-concept.md >}}) | v1.13 |
@ -49,6 +49,16 @@ spec:
    value: "2.0.0"
  - name: direction
    value: "input, output"
  - name: schemaRegistryURL # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry URL.
    value: http://localhost:8081
  - name: schemaRegistryAPIKey # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry API Key.
    value: XYAXXAZ
  - name: schemaRegistryAPISecret # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry credentials API Secret.
    value: "ABCDEFGMEADFF"
  - name: schemaCachingEnabled # Optional. When using Schema Registry Avro serialization/deserialization. Enables caching for schemas.
    value: true
  - name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
    value: 5m
```

## Spec metadata fields
@ -75,6 +85,11 @@ spec:
| `version` | N | Input/Output | Kafka cluster version. Defaults to 2.0.0. Note that this must be set to `1.0.0` for Azure Event Hubs with Kafka. | `"1.0.0"` |
| `direction` | N | Input/Output | The direction of the binding. | `"input"`, `"output"`, `"input, output"` |
| `oidcExtensions` | N | Input/Output | String containing a JSON-encoded dictionary of OAuth2/OIDC extensions to request with the access token. | `{"cluster":"kafka","poolid":"kafkapool"}` |
| `schemaRegistryURL` | N | Input/Output | Required when using Schema Registry Avro serialization/deserialization. The Schema Registry URL. | `http://localhost:8081` |
| `schemaRegistryAPIKey` | N | Input/Output | When using Schema Registry Avro serialization/deserialization, the Schema Registry credentials API Key. | `XYAXXAZ` |
| `schemaRegistryAPISecret` | N | Input/Output | When using Schema Registry Avro serialization/deserialization, the Schema Registry credentials API Secret. | `ABCDEFGMEADFF` |
| `schemaCachingEnabled` | N | Input/Output | When using Schema Registry Avro serialization/deserialization, enables caching for schemas. Default is `true`. | `true` |
| `schemaLatestVersionCacheTTL` | N | Input/Output | When using Schema Registry Avro serialization/deserialization, the TTL for schema caching when publishing a message with the latest available schema. Default is 5 minutes. | `5m` |

#### Note
The metadata `version` must be set to `1.0.0` when using Azure Event Hubs with Kafka.
@ -49,6 +49,17 @@ spec:
    value: 2.0.0
  - name: disableTls # Optional. Disable TLS. This is not safe for production! Read the `Mutual TLS` section for how to use TLS.
    value: "true"
  - name: schemaRegistryURL # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry URL.
    value: http://localhost:8081
  - name: schemaRegistryAPIKey # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry API Key.
    value: XYAXXAZ
  - name: schemaRegistryAPISecret # Optional. When using Schema Registry Avro serialization/deserialization. The Schema Registry credentials API Secret.
    value: "ABCDEFGMEADFF"
  - name: schemaCachingEnabled # Optional. When using Schema Registry Avro serialization/deserialization. Enables caching for schemas.
    value: true
  - name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
    value: 5m
```

> For details on using `secretKeyRef`, see the guide on [how to reference secrets in components]({{< ref component-secrets.md >}}).
@ -81,6 +92,11 @@ spec:
| oidcClientSecret | N | The OAuth2 client secret that has been provisioned in the identity provider. Required when `authType` is set to `oidc` | `"KeFg23!"` |
| oidcScopes | N | Comma-delimited list of OAuth2/OIDC scopes to request with the access token. Recommended when `authType` is set to `oidc`. Defaults to `"openid"` | `"openid,kafka-prod"` |
| oidcExtensions | N | String containing a JSON-encoded dictionary of OAuth2/OIDC extensions to request with the access token | `{"cluster":"kafka","poolid":"kafkapool"}` |
| schemaRegistryURL | N | Required when using Schema Registry Avro serialization/deserialization. The Schema Registry URL. | `http://localhost:8081` |
| schemaRegistryAPIKey | N | When using Schema Registry Avro serialization/deserialization, the Schema Registry credentials API Key. | `XYAXXAZ` |
| schemaRegistryAPISecret | N | When using Schema Registry Avro serialization/deserialization, the Schema Registry credentials API Secret. | `ABCDEFGMEADFF` |
| schemaCachingEnabled | N | When using Schema Registry Avro serialization/deserialization, enables caching for schemas. Default is `true`. | `true` |
| schemaLatestVersionCacheTTL | N | When using Schema Registry Avro serialization/deserialization, the TTL for schema caching when publishing a message with the latest available schema. Default is 5 minutes. | `5m` |

The `secretKeyRef` above is referencing a [Kubernetes secret store]({{< ref kubernetes-secret-store.md >}}) to access the TLS information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
@ -348,6 +364,103 @@ curl -X POST http://localhost:3500/v1.0/publish/myKafka/myTopic?metadata.correla
}'
```

## Avro Schema Registry serialization/deserialization

You can configure pub/sub to publish or consume data encoded using [Avro binary serialization](https://avro.apache.org/docs/), leveraging a schema registry (for example, [Confluent Schema Registry](https://developer.confluent.io/courses/apache-kafka/schema-registry/) or [Apicurio](https://www.apicur.io/registry/)).

### Configuration

{{% alert title="Important" color="warning" %}}
Currently, only message value serialization/deserialization is supported. Since CloudEvents are not supported, the `rawPayload=true` metadata must be passed.
{{% /alert %}}

When configuring the Kafka pub/sub component metadata, you must define:
- The schema registry URL
- The API key/secret, if applicable

Schema subjects are automatically derived from topic names, using the standard naming convention. For example, for a topic named `my-topic`, the schema subject is `my-topic-value`.
When interacting with the message payload within the service, it is in JSON format. The payload is transparently serialized/deserialized within the Dapr component.
Date/datetime fields must be passed as their [epoch Unix timestamp](https://en.wikipedia.org/wiki/Unix_time) equivalent (rather than the typical ISO 8601 format). For example:
- `2024-01-10T04:36:05.986Z` should be passed as `1704861365986` (the number of milliseconds since Jan 1st, 1970)
- `2024-01-10` should be passed as `19732` (the number of days since Jan 1st, 1970)
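As a quick sketch of computing those values, assuming a GNU coreutils environment:

```bash
# Milliseconds since Jan 1st, 1970 for a datetime (GNU date)
date -u -d '2024-01-10T04:36:05.986Z' +%s%3N   # 1704861365986

# Days since Jan 1st, 1970 for a date
echo $(( $(date -u -d '2024-01-10' +%s) / 86400 ))   # 19732
```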
### Publishing Avro messages

To indicate to the Kafka pub/sub component that the message should be serialized using Avro, the `valueSchemaType` metadata must be set to `Avro`.

{{< tabs curl "Python SDK">}}

{{% codetab %}}
```bash
curl -X POST "http://localhost:3500/v1.0/publish/pubsub/my-topic?metadata.rawPayload=true&metadata.valueSchemaType=Avro" -H "Content-Type: application/json" -d '{"order_number": "345", "created_date": 1704861365986}'
```
{{% /codetab %}}

{{% codetab %}}
```python
import json

from dapr.clients import DaprClient

with DaprClient() as d:
    req_data = {
        'order_number': '345',
        'created_date': 1704861365986
    }
    # Publish the message, requesting Avro serialization of the value
    resp = d.publish_event(
        pubsub_name='pubsub',
        topic_name='my-topic',
        data=json.dumps(req_data),
        publish_metadata={'rawPayload': 'true', 'valueSchemaType': 'Avro'}
    )
    # Print the request
    print(req_data, flush=True)
```
{{% /codetab %}}

{{< /tabs >}}
### Subscribing to Avro topics

To indicate to the Kafka pub/sub component that the message should be deserialized using Avro, the `valueSchemaType` metadata must be set to `Avro` in the subscription metadata.

{{< tabs "Python (FastAPI)" >}}

{{% codetab %}}

```python
from fastapi import APIRouter, Body, FastAPI, Response, status

app = FastAPI()

router = APIRouter()


@router.get('/dapr/subscribe')
def subscribe():
    # Declare the subscription, requesting Avro deserialization of the value
    subscriptions = [{'pubsubname': 'pubsub',
                      'topic': 'my-topic',
                      'route': 'my_topic_subscriber',
                      'metadata': {
                          'rawPayload': 'true',
                          'valueSchemaType': 'Avro',
                      }}]
    return subscriptions


@router.post('/my_topic_subscriber')
def my_topic_subscriber(event_data=Body()):
    # The Avro-encoded value arrives here already deserialized to JSON
    print(event_data, flush=True)
    return Response(status_code=status.HTTP_200_OK)


app.include_router(router)
```

{{% /codetab %}}

{{< /tabs >}}
## Create a Kafka instance

{{< tabs "Self-Hosted" "Kubernetes">}}
@ -1 +1 @@
Subproject commit 99d874a2b138af020df099a0fc0a09a7d0597fae
Subproject commit 10ef81873b3448fb136c73ad26a9fd2768954c2f

@ -1 +1 @@
Subproject commit e16e0350a52349b5a05138edc0b58e3be78ee753
Subproject commit 04f7b595b6d19bbf1c42a3364992016c3ae3e40e

@ -1 +1 @@
Subproject commit 5e45aa86b81748bf1e6efdbf7f52c20645a12435
Subproject commit 6759f19f8374c7c550c709b1fe8118ce738280a8

@ -1 +1 @@
Subproject commit df7eff281a5a1395a7967c658a5707e8dfb2b99e
Subproject commit 6e89215f5ca26f8f4d109424e2cad7792b9d8a28

@ -1 +1 @@
Subproject commit 6171b67db60d51704ed8425ae71dda9226bf1255
Subproject commit c08e71494a644f9ff875941c669c6a1e1f3a3340