Merge branch 'issue_4117' of https://github.com/hhunter-ms/docs into issue_4117

Hannah Hunter 2024-09-11 12:26:15 -04:00
commit 54730f70e8
8 changed files with 50 additions and 53 deletions


@@ -2,47 +2,9 @@
type: docs
title: "Dapr Roadmap"
linkTitle: "Roadmap"
description: "The Dapr Roadmap is a tool to help with visibility into investments across the Dapr project"
description: "The Dapr Roadmap gives the community visibility into the different priorities of the project"
weight: 30
no_list: true
---
Dapr encourages the community to help with prioritization. A GitHub project board is available for viewing and providing feedback on proposed issues, and for tracking them across development.
[<img src="/images/roadmap.png" alt="Screenshot of the Dapr Roadmap board" width=500 >](https://aka.ms/dapr/roadmap)
{{< button text="View the backlog" link="https://aka.ms/dapr/roadmap" color="primary" >}}
<br />
Please vote by adding a 👍 on the GitHub issues for the feature capabilities you would most like to see Dapr support. This will help the Dapr maintainers understand which features will provide the most value.
Contributions from the community are also welcome. If there are features on the roadmap that you are interested in contributing to, please comment on the GitHub issue and include your solution proposal.
{{% alert title="Note" color="primary" %}}
The Dapr roadmap includes issues only from the v1.2 release and onwards. Issues closed and released prior to v1.2 are not included.
{{% /alert %}}
## Stages
The Dapr Roadmap progresses through the following stages:
{{< cardpane >}}
{{< card title="**[📄 Backlog](https://github.com/orgs/dapr/projects/52#column-14691591)**" >}}
Issues (features) that need 👍 votes from the community to prioritize. Updated by Dapr maintainers.
{{< /card >}}
{{< card title="**[⏳ Planned (Committed)](https://github.com/orgs/dapr/projects/52#column-14561691)**" >}}
Issues with a proposal and/or targeted release milestone. This is where design proposals are discussed and agreed upon.
{{< /card >}}
{{< card title="**[👩‍💻 In Progress (Development)](https://github.com/orgs/dapr/projects/52#column-14561696)**" >}}
Implementation specifics have been agreed upon and the feature is under active development.
{{< /card >}}
{{< /cardpane >}}
{{< cardpane >}}
{{< card title="**[☑ Done](https://github.com/orgs/dapr/projects/52#column-14561700)**" >}}
The feature capability has been completed and is scheduled for an upcoming release.
{{< /card >}}
{{< card title="**[✅ Released](https://github.com/orgs/dapr/projects/52#column-14659973)**" >}}
The feature is released and available for use.
{{< /card >}}
{{< /cardpane >}}
See [this document](https://github.com/dapr/community/blob/master/roadmap.md) to view the Dapr project's roadmap.


@@ -26,6 +26,4 @@ By studying past resource behavior, recommend application resource optimization
The application graph facilitates collaboration between dev and ops by providing a dynamic overview of your services and infrastructure components.
Try out [Conductor Free](https://www.diagrid.io/pricing), ideal for individual developers building and testing Dapr applications on Kubernetes.
{{< button text="Learn more about Diagrid Conductor" link="https://www.diagrid.io/conductor" >}}


@@ -45,6 +45,7 @@ The table below shows the versions of Dapr releases that have been tested togeth
| Release date | Runtime | CLI | SDKs | Dashboard | Status | Release notes |
|--------------------|:--------:|:--------|---------|---------|---------|------------|
| August 14th 2024 | 1.14.1</br> | 1.14.1 | Java 1.12.0 </br>Go 1.11.0 </br>PHP 1.2.0 </br>Python 1.14.0 </br>.NET 1.14.0 </br>JS 3.3.1 | 0.15.0 | Supported (current) | [v1.14.1 release notes](https://github.com/dapr/dapr/releases/tag/v1.14.1) |
| August 14th 2024 | 1.14.0</br> | 1.14.0 | Java 1.12.0 </br>Go 1.11.0 </br>PHP 1.2.0 </br>Python 1.14.0 </br>.NET 1.14.0 </br>JS 3.3.1 | 0.15.0 | Supported (current) | [v1.14.0 release notes](https://github.com/dapr/dapr/releases/tag/v1.14.0) |
| May 29th 2024 | 1.13.4</br> | 1.13.0 | Java 1.11.0 </br>Go 1.10.0 </br>PHP 1.2.0 </br>Python 1.13.0 </br>.NET 1.13.0 </br>JS 3.3.0 | 0.14.0 | Supported | [v1.13.4 release notes](https://github.com/dapr/dapr/releases/tag/v1.13.4) |
| May 21st 2024 | 1.13.3</br> | 1.13.0 | Java 1.11.0 </br>Go 1.10.0 </br>PHP 1.2.0 </br>Python 1.13.0 </br>.NET 1.13.0 </br>JS 3.3.0 | 0.14.0 | Supported | [v1.13.3 release notes](https://github.com/dapr/dapr/releases/tag/v1.13.3) |
@@ -139,7 +140,7 @@ General guidance on upgrading can be found for [self hosted mode]({{< ref self-h
| 1.11.0 to 1.11.4 | N/A | 1.12.4 |
| 1.12.0 to 1.12.4 | N/A | 1.13.5 |
| 1.13.0 to 1.13.5 | N/A | 1.14.0 |
| 1.14.0 | N/A | 1.14.0 |
| 1.14.0 to 1.14.1 | N/A | 1.14.1 |
## Upgrade on Hosting platforms


@@ -52,7 +52,7 @@ The people who should have access to read your security report are listed in [`m
code which allows the issue to be reproduced. Explain why you believe this
to be a security issue in Dapr.
2. Put that information into an email. Use a descriptive title.
3. Send the email to [Dapr Maintainers (dapr@dapr.io)](mailto:dapr@dapr.io?subject=[Security%20Disclosure]:%20ISSUE%20TITLE)
3. Send an email to [Security (security@dapr.io)](mailto:security@dapr.io?subject=[Security%20Disclosure]:%20ISSUE%20TITLE)
## Response


@@ -63,13 +63,11 @@ Entry | Description | Equivalent
```json
{
  "data": {
    "@type": "type.googleapis.com/google.protobuf.StringValue",
    "value": "\"someData\""
  },
  "dueTime": "30s"
}
```
@@ -90,14 +88,12 @@ $ curl -X POST \
http://localhost:3500/v1.0-alpha1/jobs/jobforjabba \
-H "Content-Type: application/json"
-d '{
    "data": {
      "@type": "type.googleapis.com/google.protobuf.StringValue",
      "value": "Running spice"
    },
    "schedule": "@every 1m",
    "repeats": 5
}'
```
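The same request can also be issued programmatically. As a minimal sketch (the helper name is illustrative, only the standard library is used, and a Dapr sidecar must be listening on port 3500 for the call to succeed):

```python
import json
import urllib.request

def schedule_job(name: str, payload: dict, port: int = 3500) -> bytes:
    """POST a job to the sidecar's alpha Jobs API; requires a running Dapr sidecar."""
    req = urllib.request.Request(
        f"http://localhost:{port}/v1.0-alpha1/jobs/{name}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # raises URLError without a sidecar
        return resp.read()

# Same body as the curl example above.
payload = {
    "data": {
        "@type": "type.googleapis.com/google.protobuf.StringValue",
        "value": "Running spice",
    },
    "schedule": "@every 1m",
    "repeats": 5,
}
# schedule_job("jobforjabba", payload)  # uncomment with a sidecar running
```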


@@ -63,6 +63,8 @@ spec:
value: true
- name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
value: 5m
- name: escapeHeaders # Optional.
value: false
```
## Spec metadata fields
@@ -99,6 +101,7 @@ spec:
| `consumerFetchDefault` | N | Input/Output | The default number of message bytes to fetch from the broker in each request. Default is `"1048576"` bytes. | `"2097152"` |
| `heartbeatInterval` | N | Input | The interval between heartbeats to the consumer coordinator. At most, the value should be set to a 1/3 of the `sessionTimeout` value. Defaults to `"3s"`. | `"5s"` |
| `sessionTimeout` | N | Input | The timeout used to detect client failures when using Kafka's group management facility. If the broker fails to receive any heartbeats from the consumer before the expiration of this session timeout, then the consumer is removed and initiates a rebalance. Defaults to `"10s"`. | `"20s"` |
| `escapeHeaders` | N | Input | Enables URL escaping of the message header values received by the consumer. Allows receiving content with special characters that are usually not allowed in HTTP headers. Default is `false`. | `true` |
#### Note
The metadata `version` must be set to `1.0.0` when using Azure EventHubs with Kafka.
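As a rough sketch of the heartbeat guidance above, a component could pair the two settings at roughly a 1:3 ratio (the component name and values here are illustrative, not recommendations):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: kafka-binding-tuned   # illustrative name
spec:
  type: bindings.kafka
  version: v1
  metadata:
  - name: brokers
    value: "dapr-kafka.myapp.svc.cluster.local:9092"
  - name: authType
    value: "none"
  - name: sessionTimeout      # consumer considered failed after this window
    value: "30s"
  - name: heartbeatInterval   # at most 1/3 of sessionTimeout
    value: "10s"
```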


@@ -63,6 +63,8 @@ spec:
value: true
- name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
value: 5m
- name: escapeHeaders # Optional.
value: false
```
@@ -112,6 +114,7 @@ spec:
| consumerFetchDefault | N | The default number of message bytes to fetch from the broker in each request. Default is `"1048576"` bytes. | `"2097152"` |
| heartbeatInterval | N | The interval between heartbeats to the consumer coordinator. At most, the value should be set to a 1/3 of the `sessionTimeout` value. Defaults to "3s". | `"5s"` |
| sessionTimeout | N | The timeout used to detect client failures when using Kafka's group management facility. If the broker fails to receive any heartbeats from the consumer before the expiration of this session timeout, then the consumer is removed and initiates a rebalance. Defaults to "10s". | `"20s"` |
| escapeHeaders | N | Enables URL escaping of the message header values received by the consumer. Allows receiving content with special characters that are usually not allowed in HTTP headers. Default is `false`. | `true` |
The `secretKeyRef` above is referencing a [kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the TLS information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
@@ -485,6 +488,39 @@ curl -X POST http://localhost:3500/v1.0/publish/myKafka/myTopic?metadata.correla
}'
```
## Receiving message headers with special characters
A consumer application may need to receive message headers that include special characters, which can cause HTTP protocol validation errors.
HTTP header values must follow the specification, which disallows some characters. [Learn more about the protocols](https://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.2).
In this case, you can enable the `escapeHeaders` configuration setting, which uses URL escaping to encode header values on the consumer side.
{{% alert title="Note" color="primary" %}}
When using this setting, the received message headers are URL escaped, and you need to URL-unescape them to get the original values.
{{% /alert %}}
Set `escapeHeaders` to `true` to URL escape.
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: kafka-pubsub-escape-headers
spec:
type: pubsub.kafka
version: v1
metadata:
- name: brokers # Required. Kafka broker connection setting
value: "dapr-kafka.myapp.svc.cluster.local:9092"
- name: consumerGroup # Optional. Used for input bindings.
value: "group1"
- name: clientID # Optional. Used as client tracing ID by Kafka brokers.
value: "my-dapr-app-id"
- name: authType # Required.
value: "none"
- name: escapeHeaders
value: "true"
```
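On the consumer side, the escaped values can be decoded with standard URL-unescaping. A minimal Python sketch using only the standard library (the header value shown is hypothetical):

```python
from urllib.parse import unquote

# With escapeHeaders enabled, header values arrive URL-escaped.
escaped = "cust%2Fid%3A%20%22123%22"  # hypothetical escaped header value
original = unquote(escaped)           # decodes %2F, %3A, %20, %22, etc.
print(original)
```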
## Avro Schema Registry serialization/deserialization
You can configure pub/sub to publish or consume data encoded using [Avro binary serialization](https://avro.apache.org/docs/), leveraging an [Apache Schema Registry](https://developer.confluent.io/courses/apache-kafka/schema-registry/) (for example, [Confluent Schema Registry](https://developer.confluent.io/courses/apache-kafka/schema-registry/), [Apicurio](https://www.apicur.io/registry/)).
@@ -597,6 +633,7 @@ To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](ht
{{< /tabs >}}
## Related links
- [Basic schema for a Dapr component]({{< ref component-schema >}})
- Read [this guide]({{< ref "howto-publish-subscribe.md#step-1-setup-the-pubsub-component" >}}) for instructions on configuring pub/sub components


@@ -1 +1 @@
{{- if .Get "short" }}1.14{{ else if .Get "long" }}1.14.0{{ else if .Get "cli" }}1.14.0{{ else }}1.14.0{{ end -}}
{{- if .Get "short" }}1.14{{ else if .Get "long" }}1.14.1{{ else if .Get "cli" }}1.14.1{{ else }}1.14.1{{ end -}}