Merge branch 'v1.9' into cassandraAlpha

greenie-msft 2022-08-02 16:56:32 -07:00 committed by GitHub
commit 4e66b809eb
GPG Key ID: 4AEE18F83AFDEB23
6 changed files with 37 additions and 26 deletions


@@ -1,13 +1,13 @@
-name: Azure Static Web App v1.8
+name: Azure Static Web App v1.9
 on:
   push:
     branches:
-      - v1.8
+      - v1.9
   pull_request:
     types: [opened, synchronize, reopened, closed]
     branches:
-      - v1.8
+      - v1.9
 jobs:
   build_and_deploy_job:
@@ -28,7 +28,7 @@ jobs:
       HUGO_ENV: production
       HUGO_VERSION: "0.100.2"
     with:
-      azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_V1_8 }}
+      azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_V1_9 }}
       repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for Github integrations (i.e. PR comments)
       skip_deploy_on_missing_secrets: true
       action: "upload"
@@ -49,6 +49,6 @@ jobs:
       id: closepullrequest
       uses: Azure/static-web-apps-deploy@v0.0.1-preview
       with:
-        azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_V1_8 }}
+        azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_V1_9 }}
         skip_deploy_on_missing_secrets: true
         action: "close"


@@ -1,5 +1,5 @@
 # Site Configuration
-baseURL = "https://docs.dapr.io/"
+baseURL = "https://v1-9.docs.dapr.io/"
 title = "Dapr Docs"
 theme = "docsy"
 disableFastRender = true
@@ -160,20 +160,20 @@ offlineSearch = false
 github_repo = "https://github.com/dapr/docs"
 github_project_repo = "https://github.com/dapr/dapr"
 github_subdir = "daprdocs"
-github_branch = "v1.7"
+github_branch = "v1.9"
 # Versioning
-version_menu = "v1.8 (latest)"
-version = "v1.8"
+version_menu = "v1.9 (preview)"
+version = "v1.9"
 archived_version = false
 url_latest_version = "https://docs.dapr.io"
 [[params.versions]]
   version = "v1.9 (preview)"
-  url = "https://v1-9.docs.dapr.io"
+  url = "#"
 [[params.versions]]
   version = "v1.8 (latest)"
-  url = "#"
+  url = "https://docs.dapr.io"
 [[params.versions]]
   version = "v1.7"
   url = "https://v1-7.docs.dapr.io"


@@ -7,7 +7,6 @@ description: "Runtime and SDK release support and upgrade policies "
 ---
 ## Introduction
 This topic details the supported versions of Dapr releases, the upgrade policies and how deprecations and breaking changes are communicated in all Dapr repositories (runtime, CLI, SDKs, etc) at versions 1.x and above.
 Dapr releases use `MAJOR.MINOR.PATCH` versioning. For example, 1.0.0.
@@ -124,6 +123,7 @@ Deprecations appear in release notes under a section named “Deprecations”, w
 After announcing a future breaking change, the change will happen in 2 releases or 6 months, whichever is greater. Deprecated features should respond with warning but do nothing otherwise.
 ### Announced deprecations
 | Feature | Deprecation announcement | Removal |


@@ -21,19 +21,27 @@ dapr publish [flags]
 ### Flags
-| Name | Environment Variable | Default | Description |
-| ------------------------ | -------------------- | ------- | ------------------------------------------------------------ |
-| `--publish-app-id`, `-i` | | | The ID that represents the app from which you are publishing |
-| `--pubsub`, `-p` | | | The name of the pub/sub component |
-| `--topic`, `-t` | | | The topic to be published to |
-| `--data`, `-d` | | | The JSON serialized string (optional) |
-| `--data-file`, `-f` | | | A file containing the JSON serialized data (optional) |
-| `--help`, `-h` | | | Print this help message |
+| Name | Environment Variable | Default | Description |
+| ---------------------------- | -------------------- | ------- | ------------------------------------------------------------ |
+| `--publish-app-id`, `-i` | | | The ID that represents the app from which you are publishing |
+| `--pubsub`, `-p` | | | The name of the pub/sub component |
+| `--topic`, `-t` | | | The topic to be published to |
+| `--data`, `-d` | | | The JSON serialized string (optional) |
+| `--data-file`, `-f` | | | A file containing the JSON serialized data (optional) |
+| `--help`, `-h` | | | Print this help message |
+| `--metadata`, `-m` | | | A JSON serialized publish metadata (optional) |
+| `--unix-domain-socket`, `-u` | | | The path to the unix domain socket (optional) |
 ### Examples
 ```bash
-# Publish to sample topic in target pubsub
+# Publish to sample topic in target pubsub via a publishing app
 dapr publish --publish-app-id appId --topic sample --pubsub target --data '{"key":"value"}'
+
+# Publish to sample topic in target pubsub via a publishing app using Unix domain socket
+dapr publish --unix-domain-socket --publish-app-id myapp --pubsub target --topic sample --data '{"key":"value"}'
+
+# Publish to sample topic in target pubsub via a publishing app without cloud event
+dapr publish --publish-app-id myapp --pubsub target --topic sample --data '{"key":"value"}' --metadata '{"rawPayload":"true"}'
 ```
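For completeness, the `--data-file` flag listed in the flags table can be exercised the same way; a minimal sketch, where the file name and app/component IDs are illustrative and the publish step is skipped when the dapr CLI is not installed:

```shell
# Write a sample payload to a file (payload.json is an illustrative name)
cat > payload.json <<'EOF'
{"key":"value"}
EOF

# Publish the file contents instead of an inline --data string;
# guarded so this sketch still runs where the dapr CLI is absent
if command -v dapr >/dev/null 2>&1; then
  dapr publish --publish-app-id myapp --pubsub target --topic sample --data-file payload.json
else
  echo "dapr CLI not installed; skipping publish"
fi
```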


@@ -9,7 +9,7 @@ description: "Detailed documentation on the Zeebe command binding component"
 To setup Zeebe command binding create a component of type `bindings.zeebe.command`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration.
-See [this](https://docs.camunda.io/docs/product-manuals/zeebe/zeebe-overview) for Zeebe documentation.
+See [this](https://docs.camunda.io/docs/components/zeebe/zeebe-overview/) for Zeebe documentation.
 ```yaml
 apiVersion: dapr.io/v1alpha1
@@ -58,7 +58,7 @@ This component supports **output binding** with the following operations:
 ### Output binding
-Zeebe uses gRPC under the hood for the Zeebe client we use in this binding. Please consult the [gRPC API reference](https://stage.docs.zeebe.io/reference/grpc.html) for more information.
+Zeebe uses gRPC under the hood for the Zeebe client we use in this binding. Please consult the [gRPC API reference](https://docs.camunda.io/docs/apis-clients/grpc/) for more information.
 #### topology


@@ -9,7 +9,7 @@ description: "Detailed documentation on the Zeebe JobWorker binding component"
 To setup Zeebe JobWorker binding create a component of type `bindings.zeebe.jobworker`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration.
-See [this](https://docs.camunda.io/docs/product-manuals/concepts/job-workers) for Zeebe JobWorker documentation.
+See [this](https://docs.camunda.io/docs/components/concepts/job-workers/) for Zeebe JobWorker documentation.
 ```yaml
 apiVersion: dapr.io/v1alpha1
@@ -46,6 +46,8 @@ spec:
     value: 0.3
   - name: fetchVariables
     value: productId, productName, productKey
+  - name: autocomplete
+    value: true
 ```
 ## Spec metadata fields
@@ -65,6 +67,7 @@ spec:
 | pollInterval | N | Input | Set the maximal interval between polling for new jobs. Defaults to 100 milliseconds | `100ms` |
 | pollThreshold | N | Input | Set the threshold of buffered activated jobs before polling for new jobs, i.e. threshold * maxJobsActive. Defaults to 0.3 | `0.3` |
 | fetchVariables | N | Input | A list of variables to fetch as the job variables; if empty, all visible variables at the time of activation for the scope of the job will be returned | `productId, productName, productKey` |
+| autocomplete | N | Input | Indicates if a job should be autocompleted or not. If not set, all jobs will be auto-completed by default. Disable it if the worker should manually complete or fail the job with either a business error or an incident | `true,false` |
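Putting the pieces together, the new `autocomplete` entry sits alongside the other metadata items in a full component definition; a trimmed sketch, where the component name and the other values are illustrative and drawn from the fields documented above:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: jobworker            # illustrative component name
spec:
  type: bindings.zeebe.jobworker
  version: v1
  metadata:
  - name: jobType
    value: fetch-products    # illustrative job type
  - name: fetchVariables
    value: productId, productName, productKey
  - name: autocomplete
    value: false             # disable when the worker completes or fails jobs itself
```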
## Binding support
@@ -96,8 +99,8 @@ Note: if the `fetchVariables` metadata field will not be passed, all process var
 #### Headers
 The Zeebe process engine has the ability to pass custom task headers to a job worker. These headers can be defined for every
-[service task](https://stage.docs.zeebe.io/bpmn-workflows/service-tasks/service-tasks.html). Task headers will be passed
-by the binding as metadata (HTTP headers) to the job worker.
+[service task](https://docs.camunda.io/docs/components/best-practices/development/service-integration-patterns/#service-task).
+Task headers will be passed by the binding as metadata (HTTP headers) to the job worker.
 The binding will also pass the following job related variables as metadata. The values will be passed as string. The table contains also the
 original data type so that it can be converted back to the equivalent data type in the used programming language for the worker.