The documentation you are viewing is for Dapr {{ . | markdownify }}
which is an older version of Dapr.
{{ with $latest_version }}For up-to-date documentation, see the
- latest version.
{{ end }}
{{ end }}
From 581a7f5aeae289974e72da5a62dbb3adaea9bfa9 Mon Sep 17 00:00:00 2001
From: Scott Hussey
Date: Tue, 28 Dec 2021 14:53:18 -0600
Subject: [PATCH 04/73] Document Kafka pub/sub OIDC authentication
- Add documentation to configure Kafka pub/sub component
for OIDC authentication
- Update documentation on the Kafka pub/sub component TLS
configuration to clarify a separation of TLS for transport
and authentication.
Signed-off-by: Scott Hussey
---
.../supported-pubsub/setup-apache-kafka.md | 229 +++++++++++++++---
1 file changed, 201 insertions(+), 28 deletions(-)
diff --git a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
index e0e4167c9..616bd31d6 100644
--- a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
+++ b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
@@ -27,11 +27,11 @@ spec:
value: "group1"
- name: clientID # Optional. Used as client tracing ID by Kafka brokers.
value: "my-dapr-app-id"
- - name: authRequired # Required.
- value: "true"
- - name: saslUsername # Required if authRequired is `true`.
+ - name: authType # Required.
+ value: "password"
+ - name: saslUsername # Required if authType is `password`.
value: "adminuser"
- - name: saslPassword # Required if authRequired is `true`.
+ - name: saslPassword # Required if authType is `password`.
secretKeyRef:
name: kafka-secrets
key: saslPasswordSecret
@@ -50,22 +50,158 @@ spec:
| brokers | Y | A comma-separated list of Kafka brokers. | `"localhost:9092,dapr-kafka.myapp.svc.cluster.local:9093"`
| consumerGroup | N | A kafka consumer group to listen on. Each record published to a topic is delivered to one consumer within each consumer group subscribed to the topic. | `"group1"`
| clientID | N | A user-provided string sent with every request to the Kafka brokers for logging, debugging, and auditing purposes. Defaults to `"sarama"`. | `"my-dapr-app"`
-| authRequired | Y | Enable [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication with the Kafka brokers. | `"true"`, `"false"`
-| saslUsername | N | The SASL username used for authentication. Only required if `authRequired` is set to `"true"`. | `"adminuser"`
-| saslPassword | N | The SASL password used for authentication. Can be `secretKeyRef` to use a [secret reference]({{< ref component-secrets.md >}}). Only required if `authRequired` is set to `"true"`. | `""`, `"KeFg23!"`
+| authRequired | N | *Deprecated* Enable [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication with the Kafka brokers. | `"true"`, `"false"`
+| authType | Y | Configure or disable authentication. Supported values: `none`, `password`, `mtls`, or `oidc` | `"password"`, `"none"`
+| saslUsername | N | The SASL username used for authentication. Only required if `authType` is set to `"password"`. | `"adminuser"`
+| saslPassword | N | The SASL password used for authentication. Can be `secretKeyRef` to use a [secret reference]({{< ref component-secrets.md >}}). Only required if `authType` is set to `"password"`. | `""`, `"KeFg23!"`
| initialOffset | N | The initial offset to use if no offset was previously committed. Should be "newest" or "oldest". Defaults to "newest". | `"oldest"`
| maxMessageBytes | N | The maximum size in bytes allowed for a single Kafka message. Defaults to 1024. | `2048`
| consumeRetryInterval | N | The interval between retries when attempting to consume topics. Treats numbers without suffix as milliseconds. Defaults to 100ms. | `200ms`
| version | N | Kafka cluster version. Defaults to 2.0.0.0 | `0.10.2.0`
| caCert | N | Certificate authority certificate, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
-| clientCert | N | Client certificate, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
-| clientKey | N | Client key, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN RSA PRIVATE KEY-----\n\n-----END RSA PRIVATE KEY-----"`
+| clientCert | N | Client certificate, required for `authType` `mtls`. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n\n-----END CERTIFICATE-----"`
+| clientKey | N | Client key, required for `authType` `mtls`. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN RSA PRIVATE KEY-----\n\n-----END RSA PRIVATE KEY-----"`
| skipVerify | N | Skip TLS verification, this is not recommended for use in production. Defaults to `"false"` | `"true"`, `"false"` |
+| disableTls | N | Disable TLS for transport security. This is not recommended for use in production. Defaults to `"false"` | `"true"`, `"false"` |
+| oidcTokenEndpoint | N | Full URL to an OAuth2 identity provider access token endpoint. Required when `authType` is set to `oidc`. | `"https://identity.example.com/v1/token"` |
+| oidcClientID | N | The OAuth2 client ID that has been provisioned in the identity provider. Required when `authType` is set to `oidc`. | `dapr-kafka` |
+| oidcClientSecret | N | The OAuth2 client secret that has been provisioned in the identity provider. Required when `authType` is set to `oidc`. | `"KeFg23!"` |
+| oidcScopes | N | Comma-delimited list of OAuth2/OIDC scopes to request with the access token. Recommended when `authType` is set to `oidc`. Defaults to `"openid"`. | `"openid,kafka-prod"` |
-### Communication using TLS
-To configure communication using TLS, ensure the Kafka broker is configured to support certificates.
-Pre-requisite includes `certficate authority certificate`, `ca issued client certificate`, `client private key`.
-Below is an example of a Kafka pubsub component configured to use TLS:
+
+The `secretKeyRef` above is referencing a [Kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the TLS information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
+
+### Authentication
+
+Kafka supports a variety of authentication schemes and Dapr supports several of them: SASL password, mTLS, and OIDC/OAuth2. With the addition of these authentication methods, the `authRequired` field has been deprecated
+and replaced by the `authType` field. If `authRequired` is set to `true`, Dapr will attempt to configure `authType` correctly based on the value of `saslPassword`. There are four valid values for `authType`: `none`, `password`, `mtls`, and `oidc`. Note that this is authentication only; authorization is still configured within Kafka.
+
+#### None
+
+Setting `authType` to `none` will disable any authentication. This is *NOT* recommended in production.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+ name: kafka-pubsub-noauth
+ namespace: default
+spec:
+ type: pubsub.kafka
+ version: v1
+ metadata:
+ - name: brokers # Required. Kafka broker connection setting
+ value: "dapr-kafka.myapp.svc.cluster.local:9092"
+ - name: consumerGroup # Optional. Used for input bindings.
+ value: "group1"
+ - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+ value: "my-dapr-app-id"
+ - name: authType # Required.
+ value: "none"
+ - name: maxMessageBytes # Optional.
+ value: 1024
+ - name: consumeRetryInterval # Optional.
+ value: 200ms
+ - name: version # Optional.
+ value: 0.10.2.0
+ - name: disableTls
+ value: "true"
+```
+
+#### SASL Password
+
+Setting `authType` to `password` will enable [SASL](https://en.wikipedia.org/wiki/Simple_Authentication_and_Security_Layer) authentication using the **PLAIN** mechanism. This requires setting
+the `saslUsername` and `saslPassword` fields.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+ name: kafka-pubsub-sasl
+ namespace: default
+spec:
+ type: pubsub.kafka
+ version: v1
+ metadata:
+ - name: brokers # Required. Kafka broker connection setting
+ value: "dapr-kafka.myapp.svc.cluster.local:9092"
+ - name: consumerGroup # Optional. Used for input bindings.
+ value: "group1"
+ - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+ value: "my-dapr-app-id"
+ - name: authType # Required.
+ value: "password"
+ - name: saslUsername # Required if authType is `password`.
+ value: "adminuser"
+ - name: saslPassword # Required if authType is `password`.
+ secretKeyRef:
+ name: kafka-secrets
+ key: saslPasswordSecret
+ - name: maxMessageBytes # Optional.
+ value: 1024
+ - name: consumeRetryInterval # Optional.
+ value: 200ms
+ - name: version # Optional.
+ value: 0.10.2.0
+ - name: caCert
+ secretKeyRef:
+ name: kafka-tls
+ key: caCert
+```
+
+#### Mutual TLS
+
+Setting `authType` to `mtls` will use an x509 client certificate (the `clientCert` field) and key (the `clientKey` field) to authenticate. Note that mTLS as an
+authentication mechanism is distinct from using TLS to secure the transport layer via encryption. mTLS requires TLS transport (meaning `disableTls` must be `false`), but securing
+the transport layer does not require using mTLS. See _Communication using TLS_ for configuring underlying TLS transport.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+ name: kafka-pubsub-mtls
+ namespace: default
+spec:
+ type: pubsub.kafka
+ version: v1
+ metadata:
+ - name: brokers # Required. Kafka broker connection setting
+ value: "dapr-kafka.myapp.svc.cluster.local:9092"
+ - name: consumerGroup # Optional. Used for input bindings.
+ value: "group1"
+ - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+ value: "my-dapr-app-id"
+ - name: authType # Required.
+ value: "mtls"
+ - name: caCert
+ secretKeyRef:
+ name: kafka-tls
+ key: caCert
+ - name: clientCert
+ secretKeyRef:
+ name: kafka-tls
+ key: clientCert
+ - name: clientKey
+ secretKeyRef:
+ name: kafka-tls
+ key: clientKey
+ - name: maxMessageBytes # Optional.
+ value: 1024
+ - name: consumeRetryInterval # Optional.
+ value: 200ms
+ - name: version # Optional.
+ value: 0.10.2.0
+```
+
+#### OAuth2 or OpenID Connect
+
+Setting `authType` to `oidc` will enable SASL authentication via the **OAUTHBEARER** mechanism. This supports specifying a bearer
+token from an external OAuth2 or [OIDC](https://en.wikipedia.org/wiki/OpenID) identity provider. Currently, only the **client_credentials** grant is supported. Configure `oidcTokenEndpoint` to
+the full URL for the identity provider access token endpoint. Set `oidcClientID` and `oidcClientSecret` to the client credentials provisioned in the identity provider. If `caCert`
+is specified in the component configuration, the certificate will be appended to the system CA trust for verifying the identity provider certificate. Similarly, if `skipVerify`
+is specified in the component configuration, it will also be applied when accessing the identity provider. By default, the only scope requested for the token is `openid` but it is highly recommended
+that additional scopes be specified via `oidcScopes` in a comma-separated list and validated by the Kafka broker. If additional scopes are not used to narrow the validity of the access token,
+a compromised Kafka broker could replay the token to access other services as the Dapr clientID.
```yaml
apiVersion: dapr.io/v1alpha1
@@ -83,9 +219,57 @@ spec:
value: "group1"
- name: clientID # Optional. Used as client tracing ID by Kafka brokers.
value: "my-dapr-app-id"
- - name: authRequired # Required.
- value: "true"
- - name: saslUsername # Required if authRequired is `true`.
+ - name: authType # Required.
+ value: "oidc"
+ - name: oidcTokenEndpoint # Required if authType is `oidc`.
+ value: "https://identity.example.com/v1/token"
+ - name: oidcClientID # Required if authType is `oidc`.
+ value: "dapr-myapp"
+ - name: oidcClientSecret # Required if authType is `oidc`.
+ secretKeyRef:
+ name: kafka-secrets
+ key: oidcClientSecret
+ - name: oidcScopes # Recommended if authType is `oidc`.
+ value: "openid,kafka-dev"
+ - name: caCert # Also applied to verifying OIDC provider certificate
+ secretKeyRef:
+ name: kafka-tls
+ key: caCert
+ - name: maxMessageBytes # Optional.
+ value: 1024
+ - name: consumeRetryInterval # Optional.
+ value: 200ms
+ - name: version # Optional.
+ value: 0.10.2.0
+```
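
The token exchange behind this configuration is a standard OAuth2 **client_credentials** request against `oidcTokenEndpoint`. The following Python sketch is illustrative only (it uses the `requests` library and the placeholder endpoint, client ID, and scopes from the example above) and shows roughly the request performed before the resulting token is presented to the broker via **OAUTHBEARER**:

```python
# Illustrative sketch of the OAuth2 client_credentials exchange against
# oidcTokenEndpoint -- not Dapr source code.
import requests

TOKEN_ENDPOINT = "https://identity.example.com/v1/token"  # oidcTokenEndpoint

resp = requests.post(
    TOKEN_ENDPOINT,
    data={
        "grant_type": "client_credentials",
        "scope": "openid kafka-dev",            # oidcScopes (space-separated in OAuth2)
    },
    auth=("dapr-myapp", "<oidcClientSecret>"),  # oidcClientID / oidcClientSecret
    timeout=10,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]      # presented to Kafka via SASL OAUTHBEARER
```
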
+
+### Communication using TLS
+
+By default, TLS is enabled to secure the transport layer to Kafka. To disable TLS, set `disableTls` to `true`. When TLS is enabled, you can
+control server certificate verification using `skipVerify` to disable verification (*NOT* recommended in production environments) and `caCert` to
+specify a trusted TLS certificate authority (CA). If no `caCert` is specified, the system CA trust will be used. To also configure mTLS authentication,
+see the section under _Authentication_.
+
+Below is an example of a Kafka pubsub component configured to use transport layer TLS:
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+ name: kafka-pubsub
+ namespace: default
+spec:
+ type: pubsub.kafka
+ version: v1
+ metadata:
+ - name: brokers # Required. Kafka broker connection setting
+ value: "dapr-kafka.myapp.svc.cluster.local:9092"
+ - name: consumerGroup # Optional. Used for input bindings.
+ value: "group1"
+ - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+ value: "my-dapr-app-id"
+ - name: authType # Required.
+ value: "password"
+ - name: saslUsername # Required if authType is `password`.
value: "adminuser"
- name: consumeRetryInterval # Optional.
value: 200ms
@@ -101,21 +285,10 @@ spec:
secretKeyRef:
name: kafka-tls
key: caCert
- - name: clientCert # Client certificate.
- secretKeyRef:
- name: kafka-tls
- key: clientCert
- - name: clientKey # Client key.
- secretKeyRef:
- name: kafka-tls
- key: clientKey
auth:
secretStore:
```
-The `secretKeyRef` above is referencing a [kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the tls information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
-
-
## Per-call metadata fields
### Partition Key
@@ -154,4 +327,4 @@ To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](ht
## Related links
- [Basic schema for a Dapr component]({{< ref component-schema >}})
- Read [this guide]({{< ref "howto-publish-subscribe.md##step-1-setup-the-pubsub-component" >}}) for instructions on configuring pub/sub components
-- [Pub/Sub building block]({{< ref pubsub >}})
\ No newline at end of file
+- [Pub/Sub building block]({{< ref pubsub >}})
From 7deff4f04e9989888a7680b8b9ef033455e6012a Mon Sep 17 00:00:00 2001
From: greenie-msft <56556602+greenie-msft@users.noreply.github.com>
Date: Tue, 28 Dec 2021 15:20:48 -0800
Subject: [PATCH 05/73] Upmerge v1.5 to v1.6 20211227 (#2064)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
* Added pub sub documentation
* Changed port number in the command
* Modified based on the review comments - 1
* Modified based on the review comments - 2
* Changed python commands
* Changed python commands
* Modified based on the review comments - 3
* Fix publish a topic examples
Step 3 and 4 of the "How-To: Publish & subscribe" publishes to the
wrong app id and pubsub. This commit fixes by changing the examples
that write to the app-id "testpubsub" to "orderprocessing"
and the pubsub "pubsub" to "order_pub_sub"
* Wrap oauth links in ignore comment
* Bumping tags to v1.5.0
* Update dapr versions for 1.5.1 hotfix
* Correct the default component init timeout
Signed-off-by: Amulya Varote
* Fix incorrect warning log-level
Signed-off-by: Amulya Varote
* Added bindings documentation
Signed-off-by: Amulya Varote
* Changed python commands
Signed-off-by: Amulya Varote
* Changed image for bindings
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Modified based on the review comments - 2
Signed-off-by: Amulya Varote
* Reverted a small change
Signed-off-by: Amulya Varote
* Added pub sub documentation
Signed-off-by: Amulya Varote
* Changed port number in the command
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Modified based on the review comments - 2
Signed-off-by: Amulya Varote
* Changed python commands
Signed-off-by: Amulya Varote
* Changed python commands
Signed-off-by: Amulya Varote
* Modified based on the review comments - 3
Signed-off-by: Amulya Varote
* Modified based on the review comments - 2
Signed-off-by: Amulya Varote
* Wrap oauth links in ignore comment
Signed-off-by: Amulya Varote
* Bumping tags to v1.5.0
Signed-off-by: Amulya Varote
* Modified based on the review comments -3
Signed-off-by: Amulya Varote
* Modified a line based on the review comment
Signed-off-by: Amulya Varote
* Fix publish a topic examples
Step 3 and 4 of the "How-To: Publish & subscribe" publishes to the
wrong app id and pubsub. This commit fixes by changing the examples
that write to the app-id "testpubsub" to "orderprocessing"
and the pubsub "pubsub" to "order_pub_sub"
Signed-off-by: Amulya Varote
* Update dapr versions for 1.5.1 hotfix
Signed-off-by: Amulya Varote
* Modified a line based on the review comment
Signed-off-by: Amulya Varote
* Modified based on the review comments
Signed-off-by: Amulya Varote
* Added pub sub documentation
Signed-off-by: Amulya Varote
* Changed port number in the command
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Added secrets store documentation
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Modified based on the review comments - 2
Signed-off-by: Amulya Varote
* Update daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
Signed-off-by: Amulya Varote
* Update daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
Signed-off-by: Amulya Varote
* Modified based on the review comments - 2
Signed-off-by: Amulya Varote
* Changed python commands
Signed-off-by: Amulya Varote
* Changed python commands
Signed-off-by: Amulya Varote
* Modified based on the review comments - 3
Signed-off-by: Amulya Varote
* Fix publish a topic examples
Step 3 and 4 of the "How-To: Publish & subscribe" publishes to the
wrong app id and pubsub. This commit fixes by changing the examples
that write to the app-id "testpubsub" to "orderprocessing"
and the pubsub "pubsub" to "order_pub_sub"
Signed-off-by: Amulya Varote
* Added full code snippets for service invocation
Signed-off-by: Amulya Varote
* Added full code snippets of secrets management
Signed-off-by: Amulya Varote
* Added full code snippets of state management
Signed-off-by: Amulya Varote
* Added full code snippets of pub sub
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Removed a line
Signed-off-by: Amulya Varote
* resolved merge conflicts
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Added full code snippets of secrets management
Signed-off-by: Amulya Varote
* Added full code snippets of pub sub
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Removed a line
Signed-off-by: Amulya Varote
* Added full code snippets of secrets management
Signed-off-by: Amulya Varote
* Added full code snippets of pub sub
Signed-off-by: Amulya Varote
* Modified based on the review comments - 1
Signed-off-by: Amulya Varote
* Solved merge conflicts
Signed-off-by: Amulya Varote
* Modified go secrets code
Signed-off-by: Amulya Varote
* Update daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
Co-authored-by: Will
* Added DCO sign-off guidance for contributing to Dapr
Signed-off-by: Nick Greenfield
* Address PR feedback
Signed-off-by: Nick Greenfield
* Add DCO check to PR template
Signed-off-by: Nick Greenfield
* simplify git command for DCO
Signed-off-by: Will Tsai
* Fix header
* Display state value instead of StateResponse object (#2046)
Signed-off-by: Jun Tian
Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>
* add ForcePathStyle support for s3 binding (#2054)
Signed-off-by: rainfd
Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>
Co-authored-by: Mark Fussell
* Remove requests dep (#2055)
* Add DCO check to PR template
Signed-off-by: Nick Greenfield
* Remove requests dep
Signed-off-by: GitHub
Co-authored-by: Nick Greenfield
Co-authored-by: Yaron Schneider
Co-authored-by: Mark Fussell
* update endpoint in s3 binding doc (#2049)
* update endpoint in s3 binding doc
Signed-off-by: lizzzcai
* fix jaeger link
Signed-off-by: lizzzcai
* update endpoint example to match region
Signed-off-by: lizzzcai
Co-authored-by: Mark Fussell
* Update autoscale-keda.md (#2047)
Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>
Co-authored-by: Mark Fussell
* Update service-mesh.md (#2044)
Minor typo
Co-authored-by: greenie-msft <56556602+greenie-msft@users.noreply.github.com>
Co-authored-by: Mark Fussell
* Added missing HTTP verb to the documentation dapr/dapr#4030 (#2045)
Signed-off-by: GitHub
Co-authored-by: Mark Fussell
* Update logs-troubleshooting.md (#2023)
The label name `dapr-placement` has been changed to `dapr-placement-server`.
Co-authored-by: Yaron Schneider
Co-authored-by: Mark Fussell
* update the DOTNET SDK
Signed-off-by: Mark Fussell
Co-authored-by: Amulya Varote
Co-authored-by: Amulya Varote
Co-authored-by: Amulya Varote
Co-authored-by: David Millman
Co-authored-by: emctl
Co-authored-by: Bernd Verst <4535280+berndverst@users.noreply.github.com>
Co-authored-by: Will
Co-authored-by: Will Tsai
Co-authored-by: Aaron Crawfis
Co-authored-by: Jun Tian
Co-authored-by: RainFD
Co-authored-by: Mark Fussell
Co-authored-by: ChethanUK
Co-authored-by: Yaron Schneider
Co-authored-by: Lize
Co-authored-by: Andrew Cartwright
Co-authored-by: Shriroop Joshi
Co-authored-by: Christoph Hofmann
Co-authored-by: Will 保哥
Co-authored-by: Mark Fussell
---
.github/pull_request_template.md | 2 +-
daprdocs/content/en/concepts/service-mesh.md | 2 +-
.../en/contributing/contributing-overview.md | 40 +-
.../bindings/howto-bindings.md | 273 +++++-
.../bindings/howto-triggers.md | 263 +++++-
.../pubsub/howto-publish-subscribe.md | 801 +++++++++++-------
.../building-blocks/secrets/howto-secrets.md | 69 +-
.../howto-invoke-discover-services.md | 159 ++--
.../state-management/howto-get-save-state.md | 346 ++++----
.../integrations/autoscale-keda.md | 2 +-
.../content/en/getting-started/quickstarts.md | 18 +-
.../supported-tracing-backends/jaeger.md | 2 +-
.../content/en/operations/security/oauth.md | 2 +
.../support/support-release-policy.md | 29 +-
.../troubleshooting/logs-troubleshooting.md | 42 +-
.../reference/api/service_invocation_api.md | 2 +-
.../supported-bindings/s3.md | 10 +
.../shortcodes/dapr-latest-version.html | 2 +-
.../building-block-input-binding-example.png | Bin 0 -> 88366 bytes
.../building-block-output-binding-example.png | Bin 0 -> 95677 bytes
.../images/building-block-pub-sub-example.png | Bin 0 -> 171404 bytes
sdkdocs/dotnet | 2 +-
22 files changed, 1425 insertions(+), 641 deletions(-)
create mode 100644 daprdocs/static/images/building-block-input-binding-example.png
create mode 100644 daprdocs/static/images/building-block-output-binding-example.png
create mode 100644 daprdocs/static/images/building-block-pub-sub-example.png
diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md
index aa50e1dc1..a6b77f550 100644
--- a/.github/pull_request_template.md
+++ b/.github/pull_request_template.md
@@ -1,7 +1,7 @@
Thank you for helping make the Dapr documentation better!
**Please follow this checklist before submitting:**
-
+- [ ] Commits are signed with Developer Certificate of Origin (DCO - [learn more](https://docs.dapr.io/contributing/contributing-overview/#developer-certificate-of-origin-signing-your-work))
- [ ] [Read the contribution guide](https://docs.dapr.io/contributing/contributing-docs/)
- [ ] Commands include options for Linux, MacOS, and Windows within codetabs
- [ ] New file and folder names are globally unique
diff --git a/daprdocs/content/en/concepts/service-mesh.md b/daprdocs/content/en/concepts/service-mesh.md
index 1c81221f7..96613a319 100644
--- a/daprdocs/content/en/concepts/service-mesh.md
+++ b/daprdocs/content/en/concepts/service-mesh.md
@@ -7,7 +7,7 @@ description: >
How Dapr compares to and works with service meshes
---
-Dapr uses a sidecar architecture, running as a separate process alongside the application and includes features such as service invocation, network security, and distributed tracing. This often raises the question: how does Dapr compare to service mesh solutions such as [Linkerd](https://linkerd.io/), [Istio](https://istio.io/) and [Open Service Mesh](https://openservicemesh.io/) amoung others?
+Dapr uses a sidecar architecture, running as a separate process alongside the application and includes features such as service invocation, network security, and distributed tracing. This often raises the question: how does Dapr compare to service mesh solutions such as [Linkerd](https://linkerd.io/), [Istio](https://istio.io/) and [Open Service Mesh](https://openservicemesh.io/) among others?
## How Dapr and service meshes compare
While Dapr and service meshes do offer some overlapping capabilities, **Dapr is not a service mesh**, where a service mesh is defined as a *networking* service mesh. Unlike a service mesh which is focused on networking concerns, Dapr is focused on providing building blocks that make it easier for developers to build applications as microservices. Dapr is developer-centric, versus service meshes which are infrastructure-centric.
diff --git a/daprdocs/content/en/contributing/contributing-overview.md b/daprdocs/content/en/contributing/contributing-overview.md
index 6a8a404b3..69bb68c2f 100644
--- a/daprdocs/content/en/contributing/contributing-overview.md
+++ b/daprdocs/content/en/contributing/contributing-overview.md
@@ -43,6 +43,7 @@ Before you submit an issue, make sure you've checked the following:
- Many changes to the Dapr runtime may require changes to the API. In that case, the best place to discuss the potential feature is the main [Dapr repo](https://github.com/dapr/dapr).
- Other examples could include bindings, state stores or entirely new components.
+
## Pull Requests
All contributions come through pull requests. To submit a proposed change, follow this workflow:
@@ -53,18 +54,53 @@ All contributions come through pull requests. To submit a proposed change, follo
1. Create your change
- Code changes require tests
1. Update relevant documentation for the change
-1. Commit and open a PR
+1. Commit with [DCO sign-off]({{< ref "contributing-overview.md#developer-certificate-of-origin-signing-your-work" >}}) and open a PR
1. Wait for the CI process to finish and make sure all checks are green
1. A maintainer of the project will be assigned, and you can expect a review within a few days
+
#### Use work-in-progress PRs for early feedback
A good way to communicate before investing too much time is to create a "Work-in-progress" PR and share it with your reviewers. The standard way of doing this is to add a "[WIP]" prefix in your PR's title and assign the **do-not-merge** label. This will let people looking at your PR know that it is not well baked yet.
-### Use of Third-party code
+## Use of Third-party code
- Third-party code must include licenses.
+## Developer Certificate of Origin: Signing your work
+#### Every commit needs to be signed
+
+The Developer Certificate of Origin (DCO) is a lightweight way for contributors to certify that they wrote or otherwise have the right to submit the code they are contributing to the project. Here is the full text of the [DCO](https://developercertificate.org/), reformatted for readability:
+```
+By making a contribution to this project, I certify that:
+ (a) The contribution was created in whole or in part by me and I have the right to submit it under the open source license indicated in the file; or
+ (b) The contribution is based upon previous work that, to the best of my knowledge, is covered under an appropriate open source license and I have the right under that license to submit that work with modifications, whether created in whole or in part by me, under the same open source license (unless I am permitted to submit under a different license), as indicated in the file; or
+ (c) The contribution was provided directly to me by some other person who certified (a), (b) or (c) and I have not modified it.
+ (d) I understand and agree that this project and the contribution are public and that a record of the contribution (including all personal information I submit with it, including my sign-off) is maintained indefinitely and may be redistributed consistent with this project or the open source license(s) involved.
+```
+Contributors sign off that they adhere to these requirements by adding a `Signed-off-by` line to commit messages.
+
+```
+This is my commit message
+Signed-off-by: Random J Developer
+```
+Git even has a `-s` command line option to append this automatically to your commit message:
+```
+$ git commit -s -m 'This is my commit message'
+```
+
+Each pull request is checked to verify that every commit it contains has a valid Signed-off-by line.
+
+#### I didn't sign my commit, now what?!
+
+No worries - you can easily replay your changes, sign them, and force push them!
+
+```
+git checkout <branch_name>
+git commit --amend --no-edit --signoff
+git push --force-with-lease <remote_name> <branch_name>
+```
+
## Code of Conduct
Please see the [Dapr community code of conduct](https://github.com/dapr/community/blob/master/CODE-OF-CONDUCT.md).
diff --git a/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-bindings.md b/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-bindings.md
index ea45ababf..ce9c1d87a 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-bindings.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-bindings.md
@@ -9,11 +9,11 @@ weight: 300
Output bindings enable you to invoke external resources without taking dependencies on special SDK or libraries.
For a complete sample showing output bindings, visit this [link](https://github.com/dapr/quickstarts/tree/master/bindings).
-Watch this [video](https://www.youtube.com/watch?v=ysklxm81MTs&feature=youtu.be&t=1960) on how to use bi-directional output bindings.
+## Example:
-
-
-
+The below code example loosely describes an application that processes orders. In the example, there is an order processing service which has a Dapr sidecar. The order processing service uses Dapr to invoke an external resource, in this case Kafka, via an output binding.
+
+
## 1. Create a binding
@@ -21,7 +21,7 @@ An output binding represents a resource that Dapr uses to invoke and send messag
For the purpose of this guide, you'll use a Kafka binding. You can find a list of the different binding specs [here]({{< ref setup-bindings >}}).
-Create a new binding component with the name of `myevent`.
+Create a new binding component with the name of `checkout`.
Inside the `metadata` section, configure Kafka related properties such as the topic to publish the message to and the broker.
@@ -36,16 +36,24 @@ Create the following YAML file, named `binding.yaml`, and save this to a `compon
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
- name: myevent
- namespace: default
+ name: checkout
spec:
type: bindings.kafka
version: v1
metadata:
+ # Kafka broker connection setting
- name: brokers
value: localhost:9092
+ # consumer configuration: topic and consumer group
+ - name: topics
+ value: sample
+ - name: consumerGroup
+ value: group1
+ # publisher configuration: topic
- name: publishTopic
- value: topic1
+ value: sample
+ - name: authRequired
+ value: "false"
```
{{% /codetab %}}
@@ -59,40 +67,273 @@ To deploy this into a Kubernetes cluster, fill in the `metadata` connection deta
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
- name: myevent
- namespace: default
+ name: checkout
spec:
type: bindings.kafka
version: v1
metadata:
+ # Kafka broker connection setting
- name: brokers
value: localhost:9092
+ # consumer configuration: topic and consumer group
+ - name: topics
+ value: sample
+ - name: consumerGroup
+ value: group1
+ # publisher configuration: topic
- name: publishTopic
- value: topic1
+ value: sample
+ - name: authRequired
+ value: "false"
```
{{% /codetab %}}
{{< /tabs >}}
-## 2. Send an event
+## 2. Send an event (Output binding)
+
+Below are code examples that leverage Dapr SDKs to interact with an output binding.
+
+{{< tabs Dotnet Java Python Go Javascript>}}
+
+{{% codetab %}}
+
+```csharp
+//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
+using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
+
+//code
+namespace EventService
+{
+ class Program
+ {
+ static async Task Main(string[] args)
+ {
+ string BINDING_NAME = "checkout";
+ string BINDING_OPERATION = "create";
+ while(true) {
+ System.Threading.Thread.Sleep(5000);
+ Random random = new Random();
+ int orderId = random.Next(1,1000);
+ using var client = new DaprClientBuilder().Build();
+ //Using Dapr SDK to invoke output binding
+ await client.InvokeBindingAsync(BINDING_NAME, BINDING_OPERATION, orderId);
+ Console.WriteLine("Sending message: " + orderId);
+ }
+ }
+ }
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 --app-ssl dotnet run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```java
+//dependencies
+import io.dapr.client.DaprClient;
+import io.dapr.client.DaprClientBuilder;
+import io.dapr.client.domain.HttpExtension;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import java.util.Random;
+import java.util.concurrent.TimeUnit;
+
+//code
+@SpringBootApplication
+public class OrderProcessingServiceApplication {
+
+ private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
+
+ public static void main(String[] args) throws InterruptedException{
+ String BINDING_NAME = "checkout";
+ String BINDING_OPERATION = "create";
+ while(true) {
+ TimeUnit.MILLISECONDS.sleep(5000);
+ Random random = new Random();
+ int orderId = random.nextInt(1000-1) + 1;
+ DaprClient client = new DaprClientBuilder().build();
+ //Using Dapr SDK to invoke output binding
+ client.invokeBinding(BINDING_NAME, BINDING_OPERATION, orderId).block();
+ log.info("Sending message: " + orderId);
+ }
+ }
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 mvn spring-boot:run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```python
+#dependencies
+import random
+from time import sleep
+import requests
+import logging
+import json
+from dapr.clients import DaprClient
+
+#code
+logging.basicConfig(level = logging.INFO)
+BINDING_NAME = 'checkout'
+BINDING_OPERATION = 'create'
+while True:
+ sleep(random.randrange(50, 5000) / 1000)
+ orderId = random.randint(1, 1000)
+ with DaprClient() as client:
+ #Using Dapr SDK to invoke output binding
+ resp = client.invoke_binding(BINDING_NAME, BINDING_OPERATION, json.dumps(orderId))
+ logging.basicConfig(level = logging.INFO)
+ logging.info('Sending message: ' + str(orderId))
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --app-protocol grpc python3 OrderProcessingService.py
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```go
+//dependencies
+import (
+ "context"
+ "log"
+ "math/rand"
+ "time"
+ "strconv"
+ dapr "github.com/dapr/go-sdk/client"
+
+)
+
+//code
+func main() {
+ BINDING_NAME := "checkout";
+ BINDING_OPERATION := "create";
+ for i := 0; i < 10; i++ {
+ time.Sleep(5000 * time.Millisecond)
+ orderId := rand.Intn(1000-1) + 1
+ client, err := dapr.NewClient()
+ if err != nil {
+ panic(err)
+ }
+ defer client.Close()
+ ctx := context.Background()
+ //Using Dapr SDK to invoke output binding
+ in := &dapr.InvokeBindingRequest{ Name: BINDING_NAME, Operation: BINDING_OPERATION , Data: []byte(strconv.Itoa(orderId))}
+ err = client.InvokeOutputBinding(ctx, in)
+ log.Println("Sending message: " + strconv.Itoa(orderId))
+ }
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 go run OrderProcessingService.go
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```javascript
+//dependencies
+
+import { DaprServer, DaprClient, CommunicationProtocolEnum } from 'dapr-client';
+
+//code
+const daprHost = "127.0.0.1";
+
+var main = async function() {
+ for(var i=0;i<10;i++) {
+ await sleep(5000);
+ var orderId = Math.floor(Math.random() * (1000 - 1) + 1);
+ start(orderId).catch((e) => {
+ console.error(e);
+ process.exit(1);
+ });
+ }
+}
+
+async function start(orderId) {
+ const BINDING_NAME = "checkout";
+ const BINDING_OPERATION = "create";
+ const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
+ //Using Dapr SDK to invoke output binding
+ const result = await client.binding.send(BINDING_NAME, BINDING_OPERATION, { orderId: orderId });
+ console.log("Sending message: " + orderId);
+}
+
+function sleep(ms) {
+ return new Promise(resolve => setTimeout(resolve, ms));
+}
+
+main();
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 npm start
+```
+
+{{% /codetab %}}
+
+{{< /tabs >}}
All that's left now is to invoke the output bindings endpoint on a running Dapr instance.
-You can do so using HTTP:
+You can also invoke the output bindings endpoint using HTTP:
```bash
-curl -X POST -H 'Content-Type: application/json' http://localhost:3500/v1.0/bindings/myevent -d '{ "data": { "message": "Hi!" }, "operation": "create" }'
+curl -X POST -H 'Content-Type: application/json' http://localhost:3601/v1.0/bindings/checkout -d '{ "data": { "orderId": "100" }, "operation": "create" }'
```
-As seen above, you invoked the `/binding` endpoint with the name of the binding to invoke, in our case its `myevent`.
+As seen above, you invoked the `/binding` endpoint with the name of the binding to invoke, in our case it's `checkout`.
The payload goes inside the mandatory `data` field, and can be any JSON serializable value.
You'll also notice that there's an `operation` field that tells the binding what you need it to do.
You can check [here]({{< ref supported-bindings >}}) which operations are supported for every output binding.
+Watch this [video](https://www.youtube.com/watch?v=ysklxm81MTs&feature=youtu.be&t=1960) on how to use bi-directional output bindings.
+
+
+
+
+
## References
- [Binding API]({{< ref bindings_api.md >}})
- [Binding components]({{< ref bindings >}})
-- [Binding detailed specifications]({{< ref supported-bindings >}})
\ No newline at end of file
+- [Binding detailed specifications]({{< ref supported-bindings >}})
diff --git a/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-triggers.md b/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-triggers.md
index c0afc13a8..ad3da9c6a 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-triggers.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/bindings/howto-triggers.md
@@ -1,7 +1,7 @@
---
type: docs
title: "How-To: Trigger your application with input bindings"
-linkTitle: "How-To: Triggers"
+linkTitle: "How-To: Input bindings"
description: "Use Dapr input bindings to trigger event driven applications"
weight: 200
---
@@ -18,78 +18,270 @@ Dapr bindings allow you to:
For more info on bindings, read [this overview]({{< ref bindings-overview.md >}}).
-For a quickstart sample showing bindings, visit this [link](https://github.com/dapr/quickstarts/tree/master/bindings).
+## Example:
+
+The below code example loosely describes an application that processes orders. In the example, there is a checkout service which has a Dapr sidecar. Dapr triggers the checkout service via an input binding when an event arrives from Kafka.
+
+
## 1. Create a binding
-An input binding represents an event resource that Dapr uses to read events from and push to your application.
+An input binding represents a resource that Dapr uses to read events from and push to your application.
-For the purpose of this HowTo, we'll use a Kafka binding. You can find a list of the different binding specs [here]({{< ref supported-bindings >}}).
+For the purpose of this guide, you'll use a Kafka binding. You can find a list of supported binding components [here]({{< ref setup-bindings >}}).
-Create the following YAML file, named binding.yaml, and save this to a `components` sub-folder in your application directory.
+Create a new binding component with the name of `checkout`.
+
+Inside the `metadata` section, configure Kafka related properties, such as the topic to publish the message to and the broker.
+
+{{< tabs "Self-Hosted (CLI)" Kubernetes >}}
+
+{{% codetab %}}
+
+Create the following YAML file, named `binding.yaml`, and save this to a `components` sub-folder in your application directory.
(Use the `--components-path` flag with `dapr run` to point to your custom components dir)
-*Note: When running in Kubernetes, apply this file to your cluster using `kubectl apply -f binding.yaml`*
-
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
- name: myevent
- namespace: default
+ name: checkout
spec:
type: bindings.kafka
version: v1
metadata:
- - name: topics
- value: topic1
+ # Kafka broker connection setting
- name: brokers
value: localhost:9092
+ # consumer configuration: topic and consumer group
+ - name: topics
+ value: sample
- name: consumerGroup
value: group1
+ # publisher configuration: topic
+ - name: publishTopic
+ value: sample
+ - name: authRequired
+ value: "false"
```
-Here, you create a new binding component with the name of `myevent`.
+{{% /codetab %}}
-Inside the `metadata` section, configure the Kafka related properties such as the topics to listen on, the brokers and more.
+{{% codetab %}}
-## 2. Listen for incoming events
+To deploy this into a Kubernetes cluster, fill in the `metadata` connection details of your [desired binding component]({{< ref setup-bindings >}}) in the yaml below (in this case kafka), save as `binding.yaml`, and run `kubectl apply -f binding.yaml`.
-Now configure your application to receive incoming events. If using HTTP, you need to listen on a `POST` endpoint with the name of the binding as specified in `metadata.name` in the file. In this example, this is `myevent`.
-*The following example shows how you would listen for the event in Node.js, but this is applicable to any programming language*
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+ name: checkout
+spec:
+ type: bindings.kafka
+ version: v1
+ metadata:
+ # Kafka broker connection setting
+ - name: brokers
+ value: localhost:9092
+ # consumer configuration: topic and consumer group
+ - name: topics
+ value: sample
+ - name: consumerGroup
+ value: group1
+ # publisher configuration: topic
+ - name: publishTopic
+ value: sample
+ - name: authRequired
+ value: "false"
+```
+
+{{% /codetab %}}
+
+{{< /tabs >}}
+
+## 2. Listen for incoming events (input binding)
+
+Now configure your application to receive incoming events. If using HTTP, you need to listen on a `POST` endpoint with the name of the binding as specified in `metadata.name` in the file.
+
+Below are code examples that leverage Dapr SDKs to demonstrate an input binding.
+
+{{< tabs Dotnet Java Python Go Javascript>}}
+
+{{% codetab %}}
+
+```csharp
+//dependencies
+using System.Collections.Generic;
+using System.Threading.Tasks;
+using System;
+using Microsoft.AspNetCore.Mvc;
+
+//code
+namespace CheckoutService.controller
+{
+ [ApiController]
+ public class CheckoutServiceController : Controller
+ {
+ [HttpPost("/checkout")]
+ public ActionResult<string> getCheckout([FromBody] int orderId)
+ {
+ Console.WriteLine("Received Message: " + orderId);
+ return "CID" + orderId;
+ }
+ }
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 --app-ssl dotnet run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```java
+//dependencies
+import org.springframework.web.bind.annotation.*;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import reactor.core.publisher.Mono;
+
+//code
+@RestController
+@RequestMapping("/")
+public class CheckoutServiceController {
+ private static final Logger log = LoggerFactory.getLogger(CheckoutServiceController.class);
+ @PostMapping(path = "/checkout")
+ public Mono<String> getCheckout(@RequestBody(required = false) byte[] body) {
+ return Mono.fromRunnable(() ->
+ log.info("Received Message: " + new String(body)));
+ }
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 mvn spring-boot:run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```python
+#dependencies
+import logging
+from dapr.ext.grpc import App, BindingRequest
+
+#code
+app = App()
+
+@app.binding('checkout')
+def getCheckout(request: BindingRequest):
+ logging.basicConfig(level = logging.INFO)
+ logging.info('Received Message : ' + request.text())
+
+app.run(6002)
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --app-protocol grpc -- python3 CheckoutService.py
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```go
+//dependencies
+import (
+ "encoding/json"
+ "log"
+ "net/http"
+ "github.com/gorilla/mux"
+)
+
+//code
+func getCheckout(w http.ResponseWriter, r *http.Request) {
+ w.Header().Set("Content-Type", "application/json")
+ var orderId int
+ err := json.NewDecoder(r.Body).Decode(&orderId)
+ log.Println("Received Message: ", orderId)
+ if err != nil {
+ log.Printf("error parsing checkout input binding payload: %s", err)
+ w.WriteHeader(http.StatusOK)
+ return
+ }
+}
+
+func main() {
+ r := mux.NewRouter()
+ r.HandleFunc("/checkout", getCheckout).Methods("POST", "OPTIONS")
+ http.ListenAndServe(":6002", r)
+}
+
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 go run CheckoutService.go
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
```javascript
-const express = require('express')
-const bodyParser = require('body-parser')
-const app = express()
-app.use(bodyParser.json())
+//dependencies
+import { DaprServer, CommunicationProtocolEnum } from 'dapr-client';
-const port = 3000
+//code
+const daprHost = "127.0.0.1";
+const serverHost = "127.0.0.1";
+const serverPort = "6002";
+const daprPort = "3602";
-app.post('/myevent', (req, res) => {
- console.log(req.body)
- res.status(200).send()
-})
+start().catch((e) => {
+ console.error(e);
+ process.exit(1);
+});
+
+async function start() {
+ const server = new DaprServer(serverHost, serverPort, daprHost, daprPort, CommunicationProtocolEnum.HTTP);
+ await server.binding.receive('checkout', async (orderId) => console.log(`Received Message: ${JSON.stringify(orderId)}`));
+ await server.startServer();
+}
-app.listen(port, () => console.log(`Kafka consumer app listening on port ${port}!`))
```
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 npm start
+```
+
+{{% /codetab %}}
+
+{{< /tabs >}}
+
### ACK-ing an event
In order to tell Dapr that you successfully processed an event in your application, return a `200 OK` response from your HTTP handler.
-```javascript
-res.status(200).send()
-```
-
### Rejecting an event
-In order to tell Dapr that the event wasn't processed correctly in your application and schedule it for redelivery, return any response different from `200 OK`. For example, a `500 Error`.
-
-```javascript
-res.status(500).send()
-```
+In order to tell Dapr that the event was not processed correctly in your application and schedule it for redelivery, return any response other than `200 OK`. For example, a `500 Error`.
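
A minimal HTTP handler sketch of both cases (Flask is assumed here purely for illustration, the `/checkout` route matches the binding above, and `process_order` is a hypothetical business-logic function):

```python
# Minimal sketch: the HTTP status code returned from the binding endpoint
# tells Dapr whether to ACK the event or schedule it for redelivery.
from flask import Flask, jsonify, request

app = Flask(__name__)

def process_order(order):
    # Hypothetical business logic.
    print(f"Received Message: {order}")

@app.route("/checkout", methods=["POST"])
def checkout_event():
    order = request.get_json()
    try:
        process_order(order)
        return jsonify(success=True), 200    # ACK: event processed successfully
    except Exception:
        return jsonify(success=False), 500   # non-200: Dapr schedules redelivery

if __name__ == "__main__":
    app.run(port=6002)
```
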
### Specifying a custom route
@@ -108,7 +300,6 @@ spec:
### Event delivery Guarantees
Event delivery guarantees are controlled by the binding implementation. Depending on the binding implementation, the event delivery can be exactly once or at least once.
-
## References
* [Bindings building block]({{< ref bindings >}})
diff --git a/daprdocs/content/en/developing-applications/building-blocks/pubsub/howto-publish-subscribe.md b/daprdocs/content/en/developing-applications/building-blocks/pubsub/howto-publish-subscribe.md
index ca2c4289d..5ace3cdb7 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/pubsub/howto-publish-subscribe.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/pubsub/howto-publish-subscribe.md
@@ -20,33 +20,48 @@ When publishing a message, it's important to specify the content type of the dat
Unless specified, Dapr will assume `text/plain`. When using Dapr's HTTP API, the content type can be set in a `Content-Type` header.
gRPC clients and SDKs have a dedicated content type parameter.
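
As a quick illustration, the Python SDK exposes this as a parameter on `publish_event` (a minimal sketch; the pub/sub name `order_pub_sub` and topic `orders` match the example that follows, and `data_content_type` is assumed to be the parameter name in the installed SDK version):

```python
# Sketch: publishing JSON with an explicit content type via the Python SDK.
# Without data_content_type, Dapr assumes text/plain.
import json
from dapr.clients import DaprClient

order = {"orderId": 100}

with DaprClient() as client:
    client.publish_event(
        pubsub_name="order_pub_sub",
        topic_name="orders",
        data=json.dumps(order),
        data_content_type="application/json",
    )
```

Over HTTP, the same effect comes from setting the `Content-Type: application/json` header on the publish request.
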
-## Step 1: Setup the Pub/Sub component
-The following example creates applications to publish and subscribe to a topic called `deathStarStatus`.
+## Example:
-
-
+The below code example loosely describes an application that processes orders. In the example, there are two services - an order processing service and a checkout service. Both services have Dapr sidecars. The order processing service uses Dapr to publish a message to RabbitMQ and the checkout service subscribes to the topic in the message queue.
+
+
+
+## Step 1: Setup the Pub/Sub component
+The following example creates applications to publish and subscribe to a topic called `orders`.
The first step is to setup the Pub/Sub component:
{{< tabs "Self-Hosted (CLI)" Kubernetes >}}
{{% codetab %}}
-Redis Streams is installed by default on a local machine when running `dapr init`.
+The pubsub.yaml is created by default on your local machine when running `dapr init`. Verify by opening your components file under `%UserProfile%\.dapr\components\pubsub.yaml` on Windows or `~/.dapr/components/pubsub.yaml` on Linux/MacOS.
+
+In this example, RabbitMQ is used for publish and subscribe. Replace `pubsub.yaml` file contents with the below contents.
-Verify by opening your components file under `%UserProfile%\.dapr\components\pubsub.yaml` on Windows or `~/.dapr/components/pubsub.yaml` on Linux/MacOS:
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
- name: pubsub
+ name: order_pub_sub
spec:
- type: pubsub.redis
+ type: pubsub.rabbitmq
version: v1
metadata:
- - name: redisHost
- value: localhost:6379
- - name: redisPassword
- value: ""
+ - name: host
+ value: "amqp://localhost:5672"
+ - name: durable
+ value: "false"
+ - name: deletedWhenUnused
+ value: "false"
+ - name: autoAck
+ value: "false"
+ - name: reconnectWait
+ value: "0"
+ - name: concurrency
+ value: parallel
+scopes:
+ - orderprocessing
+ - checkout
```
You can override this file with another Redis instance or another [pubsub component]({{< ref setup-pubsub >}}) by creating a `components` directory containing the file and using the flag `--components-path` with the `dapr run` CLI command.
@@ -59,16 +74,27 @@ To deploy this into a Kubernetes cluster, fill in the `metadata` connection deta
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
- name: pubsub
+ name: order_pub_sub
namespace: default
spec:
- type: pubsub.redis
+ type: pubsub.rabbitmq
version: v1
metadata:
- - name: redisHost
- value: localhost:6379
- - name: redisPassword
- value: ""
+ - name: host
+ value: "amqp://localhost:5672"
+ - name: durable
+ value: "false"
+ - name: deletedWhenUnused
+ value: "false"
+ - name: autoAck
+ value: "false"
+ - name: reconnectWait
+ value: "0"
+ - name: concurrency
+ value: parallel
+scopes:
+ - orderprocessing
+ - checkout
```
{{% /codetab %}}
@@ -95,41 +121,72 @@ You can subscribe to a topic using the following Custom Resources Definition (CR
apiVersion: dapr.io/v1alpha1
kind: Subscription
metadata:
- name: myevent-subscription
+ name: order_pub_sub
spec:
- topic: deathStarStatus
- route: /dsstatus
- pubsubname: pubsub
+ topic: orders
+ route: /checkout
+ pubsubname: order_pub_sub
scopes:
-- app1
-- app2
+- orderprocessing
+- checkout
```
-The example above shows an event subscription to topic `deathStarStatus`, for the pubsub component `pubsub`.
-- The `route` field tells Dapr to send all topic messages to the `/dsstatus` endpoint in the app.
-- The `scopes` field enables this subscription for apps with IDs `app1` and `app2`.
+The example above shows an event subscription to topic `orders`, for the pubsub component `order_pub_sub`.
+- The `route` field tells Dapr to send all topic messages to the `/checkout` endpoint in the app.
+- The `scopes` field enables this subscription for apps with IDs `orderprocessing` and `checkout`.
Set the component with:
-{{< tabs "Self-Hosted (CLI)" Kubernetes>}}
-{{% codetab %}}
Place the CRD in your `./components` directory. When Dapr starts up, it loads subscriptions along with components.
Note: By default, Dapr loads components from `$HOME/.dapr/components` on MacOS/Linux and `%USERPROFILE%\.dapr\components` on Windows.
You can also override the default directory by pointing the Dapr CLI to a components path:
+{{< tabs Dotnet Java Python Go Javascript Kubernetes>}}
+
+{{% codetab %}}
+
```bash
-dapr run --app-id myapp --components-path ./myComponents -- python3 app1.py
+dapr run --app-id myapp --components-path ./myComponents -- dotnet run
```
-*Note: If you place the subscription in a custom components path, make sure the Pub/Sub component is present also.*
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```bash
+dapr run --app-id myapp --components-path ./myComponents -- mvn spring-boot:run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```bash
+dapr run --app-id myapp --components-path ./myComponents -- python3 app.py
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```bash
+dapr run --app-id myapp --components-path ./myComponents -- go run app.go
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```bash
+dapr run --app-id myapp --components-path ./myComponents -- npm start
+```
{{% /codetab %}}
{{% codetab %}}
In Kubernetes, save the CRD to a file and apply it to the cluster:
-
```bash
kubectl apply -f subscription.yaml
```
@@ -137,251 +194,233 @@ kubectl apply -f subscription.yaml
{{< /tabs >}}
-#### Example
+Below are code examples that leverage Dapr SDKs to subscribe to a topic.
-{{< tabs Python Node PHP>}}
-
-{{% codetab %}}
-Create a file named `app1.py` and paste in the following:
-```python
-import flask
-from flask import request, jsonify
-from flask_cors import CORS
-import json
-import sys
-
-app = flask.Flask(__name__)
-CORS(app)
-
-@app.route('/dsstatus', methods=['POST'])
-def ds_subscriber():
- print(request.json, flush=True)
- return json.dumps({'success':True}), 200, {'ContentType':'application/json'}
-
-app.run()
-```
-After creating `app1.py` ensure flask and flask_cors are installed:
-
-```bash
-pip install flask
-pip install flask_cors
-```
-
-Then run:
-
-```bash
-dapr --app-id app1 --app-port 5000 run python app1.py
-```
-{{% /codetab %}}
-
-{{% codetab %}}
-After setting up the subscription above, download this javascript (Node > 4.16) into a `app2.js` file:
-
-```javascript
-const express = require('express')
-const bodyParser = require('body-parser')
-const app = express()
-app.use(bodyParser.json({ type: 'application/*+json' }));
-
-const port = 3000
-
-app.post('/dsstatus', (req, res) => {
- console.log(req.body);
- res.sendStatus(200);
-});
-
-app.listen(port, () => console.log(`consumer app listening on port ${port}!`))
-```
-Run this app with:
-
-```bash
-dapr --app-id app2 --app-port 3000 run node app2.js
-```
-{{% /codetab %}}
+{{< tabs Dotnet Java Python Go Javascript>}}
{{% codetab %}}
-Create a file named `app1.php` and paste in the following:
+```csharp
+//dependencies
+using System.Collections.Generic;
+using System.Threading.Tasks;
+using System;
+using Microsoft.AspNetCore.Mvc;
+using Dapr;
+using Dapr.Client;
-```php
-post('/dsstatus', function(
- #[\Dapr\Attributes\FromBody]
- \Dapr\PubSub\CloudEvent $cloudEvent,
- \Psr\Log\LoggerInterface $logger
- ) {
- $logger->alert('Received event: {event}', ['event' => $cloudEvent]);
- return ['status' => 'SUCCESS'];
- }
-);
-$app->start();
-```
-
-After creating `app1.php`, and with the [SDK installed](https://docs.dapr.io/developing-applications/sdks/php/),
-go ahead and start the app:
-
-```bash
-dapr --app-id app1 --app-port 3000 run -- php -S 0.0.0.0:3000 app1.php
-```
-
-{{% /codetab %}}
-
-{{< /tabs >}}
-
-### Programmatic subscriptions
-
-To subscribe to topics, start a web server in the programming language of your choice and listen on the following `GET` endpoint: `/dapr/subscribe`.
-The Dapr instance calls into your app at startup and expect a JSON response for the topic subscriptions with:
-- `pubsubname`: Which pub/sub component Dapr should use.
-- `topic`: Which topic to subscribe to.
-- `route`: Which endpoint for Dapr to call on when a message comes to that topic.
-
-#### Example
-
-{{< tabs Python Node PHP>}}
-
-{{% codetab %}}
-```python
-import flask
-from flask import request, jsonify
-from flask_cors import CORS
-import json
-import sys
-
-app = flask.Flask(__name__)
-CORS(app)
-
-@app.route('/dapr/subscribe', methods=['GET'])
-def subscribe():
- subscriptions = [{'pubsubname': 'pubsub',
- 'topic': 'deathStarStatus',
- 'route': 'dsstatus'}]
- return jsonify(subscriptions)
-
-@app.route('/dsstatus', methods=['POST'])
-def ds_subscriber():
- print(request.json, flush=True)
- return json.dumps({'success':True}), 200, {'ContentType':'application/json'}
-app.run()
-```
-After creating `app1.py` ensure flask and flask_cors are installed:
-
-```bash
-pip install flask
-pip install flask_cors
-```
-
-Then run:
-
-```bash
-dapr --app-id app1 --app-port 5000 run python app1.py
-```
-{{% /codetab %}}
-
-{{% codetab %}}
-```javascript
-const express = require('express')
-const bodyParser = require('body-parser')
-const app = express()
-app.use(bodyParser.json({ type: 'application/*+json' }));
-
-const port = 3000
-
-app.get('/dapr/subscribe', (req, res) => {
- res.json([
+//code
+namespace CheckoutService.controller
+{
+ [ApiController]
+ public class CheckoutServiceController : Controller
+ {
+ //Subscribe to a topic
+ [Topic("order_pub_sub", "orders")]
+ [HttpPost("checkout")]
+ public void getCheckout([FromBody] int orderId)
{
- pubsubname: "pubsub",
- topic: "deathStarStatus",
- route: "dsstatus"
+ Console.WriteLine("Subscriber received : " + orderId);
}
- ]);
-})
-
-app.post('/dsstatus', (req, res) => {
- console.log(req.body);
- res.sendStatus(200);
-});
-
-app.listen(port, () => console.log(`consumer app listening on port ${port}!`))
+ }
+}
```
-Run this app with:
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
```bash
-dapr --app-id app2 --app-port 3000 run node app2.js
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 --app-ssl dotnet run
```
+
{{% /codetab %}}
{{% codetab %}}
-Update `app1.php` with the following:
+```java
+//dependencies
+import io.dapr.Topic;
+import io.dapr.client.domain.CloudEvent;
+import org.springframework.web.bind.annotation.*;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import reactor.core.publisher.Mono;
-```php
- $builder->addDefinitions(['dapr.subscriptions' => [
- new \Dapr\PubSub\Subscription(pubsubname: 'pubsub', topic: 'deathStarStatus', route: '/dsstatus'),
-]]));
-$app->post('/dsstatus', function(
- #[\Dapr\Attributes\FromBody]
- \Dapr\PubSub\CloudEvent $cloudEvent,
- \Psr\Log\LoggerInterface $logger
- ) {
- $logger->alert('Received event: {event}', ['event' => $cloudEvent]);
- return ['status' => 'SUCCESS'];
+//code
+@RestController
+public class CheckoutServiceController {
+
+    private static final Logger log = LoggerFactory.getLogger(CheckoutServiceController.class);
+ //Subscribe to a topic
+ @Topic(name = "orders", pubsubName = "order_pub_sub")
+ @PostMapping(path = "/checkout")
+    public Mono<Void> getCheckout(@RequestBody(required = false) CloudEvent cloudEvent) {
+ return Mono.fromRunnable(() -> {
+ try {
+ log.info("Subscriber received: " + cloudEvent.getData());
+ } catch (Exception e) {
+ throw new RuntimeException(e);
+ }
+ });
}
-);
-$app->start();
+}
```
-Run this app with:
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
```bash
-dapr --app-id app1 --app-port 3000 run -- php -S 0.0.0.0:3000 app1.php
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 mvn spring-boot:run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```python
+#dependencies
+from cloudevents.sdk.event import v1
+from dapr.ext.grpc import App
+import logging
+import json
+
+#code
+app = App()
+logging.basicConfig(level = logging.INFO)
+#Subscribe to a topic
+@app.subscribe(pubsub_name='order_pub_sub', topic='orders')
+def mytopic(event: v1.Event) -> None:
+ data = json.loads(event.Data())
+ logging.info('Subscriber received: ' + str(data))
+
+app.run(6002)
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --app-protocol grpc -- python3 CheckoutService.py
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```go
+//dependencies
+import (
+ "log"
+ "net/http"
+ "context"
+
+ "github.com/dapr/go-sdk/service/common"
+ daprd "github.com/dapr/go-sdk/service/http"
+)
+
+//code
+var sub = &common.Subscription{
+ PubsubName: "order_pub_sub",
+ Topic: "orders",
+ Route: "/checkout",
+}
+
+func main() {
+ s := daprd.NewService(":6002")
+ //Subscribe to a topic
+ if err := s.AddTopicEventHandler(sub, eventHandler); err != nil {
+ log.Fatalf("error adding topic subscription: %v", err)
+ }
+ if err := s.Start(); err != nil && err != http.ErrServerClosed {
+        log.Fatalf("error listening: %v", err)
+ }
+}
+
+func eventHandler(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
+ log.Printf("Subscriber received: %s", e.Data)
+ return false, nil
+}
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 go run CheckoutService.go
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```javascript
+//dependencies
+import { DaprServer, CommunicationProtocolEnum } from 'dapr-client';
+
+//code
+const daprHost = "127.0.0.1";
+const serverHost = "127.0.0.1";
+const serverPort = "6002";
+
+start().catch((e) => {
+ console.error(e);
+ process.exit(1);
+});
+
+async function start() {
+ const server = new DaprServer(
+ serverHost,
+ serverPort,
+ daprHost,
+ process.env.DAPR_HTTP_PORT,
+ CommunicationProtocolEnum.HTTP
+ );
+ //Subscribe to a topic
+ await server.pubsub.subscribe("order_pub_sub", "orders", async (orderId) => {
+ console.log(`Subscriber received: ${JSON.stringify(orderId)}`)
+ });
+ await server.startServer();
+}
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id checkout --app-port 6002 --dapr-http-port 3602 --dapr-grpc-port 60002 npm start
```
{{% /codetab %}}
{{< /tabs >}}
-The `/dsstatus` endpoint matches the `route` defined in the subscriptions and this is where Dapr will send all topic messages to.
+The `/checkout` endpoint matches the `route` defined in the subscription, and it is where Dapr delivers all messages for that topic.
## Step 3: Publish a topic
-To publish a topic you need to run an instance of a Dapr sidecar to use the pubsub Redis component. You can use the default Redis component installed into your local environment.
-
-Start an instance of Dapr with an app-id called `testpubsub`:
+Start an instance of Dapr with an app-id called `orderprocessing`:
```bash
-dapr run --app-id testpubsub --dapr-http-port 3500
+dapr run --app-id orderprocessing --dapr-http-port 3601
```
{{< tabs "Dapr CLI" "HTTP API (Bash)" "HTTP API (PowerShell)">}}
{{% codetab %}}
-Then publish a message to the `deathStarStatus` topic:
+Then publish a message to the `orders` topic:
```bash
-dapr publish --publish-app-id testpubsub --pubsub pubsub --topic deathStarStatus --data '{"status": "completed"}'
+dapr publish --publish-app-id orderprocessing --pubsub order_pub_sub --topic orders --data '{"orderId": "100"}'
```
{{% /codetab %}}
{{% codetab %}}
-Then publish a message to the `deathStarStatus` topic:
+Then publish a message to the `orders` topic:
```bash
-curl -X POST http://localhost:3500/v1.0/publish/pubsub/deathStarStatus -H "Content-Type: application/json" -d '{"status": "completed"}'
+curl -X POST http://localhost:3601/v1.0/publish/order_pub_sub/orders -H "Content-Type: application/json" -d '{"orderId": "100"}'
```
{{% /codetab %}}
{{% codetab %}}
-Then publish a message to the `deathStarStatus` topic:
+Then publish a message to the `orders` topic:
```powershell
-Invoke-RestMethod -Method Post -ContentType 'application/json' -Body '{"status": "completed"}' -Uri 'http://localhost:3500/v1.0/publish/pubsub/deathStarStatus'
+Invoke-RestMethod -Method Post -ContentType 'application/json' -Body '{"orderId": "100"}' -Uri 'http://localhost:3601/v1.0/publish/order_pub_sub/orders'
```
{{% /codetab %}}
@@ -389,96 +428,240 @@ Invoke-RestMethod -Method Post -ContentType 'application/json' -Body '{"status":
Dapr automatically wraps the user payload in a Cloud Events v1.0 compliant envelope, using `Content-Type` header value for `datacontenttype` attribute.
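+
+For illustration, here is a rough Python sketch of the envelope Dapr builds around the published payload. The field values are assumptions for this example; the exact attribute set comes from the CloudEvents v1.0 specification and the pub/sub component in use.
+
+```python
+# Illustrative only: an approximation of the CloudEvent envelope that Dapr
+# wraps around the published payload. Values such as id and source are
+# generated by Dapr at publish time and will differ in practice.
+import json
+
+envelope = {
+    "specversion": "1.0",
+    "type": "com.dapr.event.sent",           # default type assigned by Dapr
+    "source": "orderprocessing",             # the publishing app-id
+    "id": "5929aaac-a5e2-4ca1-859c-edfe73f11565",
+    "datacontenttype": "application/json",   # taken from the Content-Type header
+    "topic": "orders",
+    "pubsubname": "order_pub_sub",
+    "data": {"orderId": "100"},              # the original user payload
+}
+print(json.dumps(envelope, indent=2))
+```
+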
-## Step 4: ACK-ing a message
+Below are code examples that leverage Dapr SDKs to publish a topic.
-In order to tell Dapr that a message was processed successfully, return a `200 OK` response. If Dapr receives any other return status code than `200`, or if your app crashes, Dapr will attempt to redeliver the message following At-Least-Once semantics.
-
-#### Example
-
-{{< tabs Python Node>}}
-
-{{% codetab %}}
-```python
-@app.route('/dsstatus', methods=['POST'])
-def ds_subscriber():
- print(request.json, flush=True)
- return json.dumps({'success':True}), 200, {'ContentType':'application/json'}
-```
-{{% /codetab %}}
-
-{{% codetab %}}
-```javascript
-app.post('/dsstatus', (req, res) => {
- res.sendStatus(200);
-});
-```
-{{% /codetab %}}
-
-{{< /tabs >}}
-
-{{% alert title="Note on message redelivery" color="primary" %}}
-Some pubsub components (e.g. Redis) will redeliver a message if a response is not sent back within a specified time window. Make sure to configure metadata such as `processingTimeout` to customize this behavior. For more information refer to the respective [component references]({{< ref supported-pubsub >}}).
-{{% /alert %}}
-
-## (Optional) Step 5: Publishing a topic with code
-
-{{< tabs Node PHP>}}
-
-{{% codetab %}}
-If you prefer publishing a topic using code, here is an example.
-
-```javascript
-const express = require('express');
-const path = require('path');
-const request = require('request');
-const bodyParser = require('body-parser');
-
-const app = express();
-app.use(bodyParser.json());
-
-const daprPort = process.env.DAPR_HTTP_PORT || 3500;
-const daprUrl = `http://localhost:${daprPort}/v1.0`;
-const port = 8080;
-const pubsubName = 'pubsub';
-
-app.post('/publish', (req, res) => {
- console.log("Publishing: ", req.body);
- const publishUrl = `${daprUrl}/publish/${pubsubName}/deathStarStatus`;
- request( { uri: publishUrl, method: 'POST', json: req.body } );
- res.sendStatus(200);
-});
-
-app.listen(process.env.PORT || port, () => console.log(`Listening on port ${port}!`));
-```
-{{% /codetab %}}
+{{< tabs Dotnet Java Python Go Javascript>}}
{{% codetab %}}
-If you prefer publishing a topic using code, here is an example.
+```csharp
+//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
+using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
-```php
-run(function(\DI\FactoryInterface $factory, \Psr\Log\LoggerInterface $logger) {
- $publisher = $factory->make(\Dapr\PubSub\Publish::class, ['pubsub' => 'pubsub']);
- $publisher->topic('deathStarStatus')->publish('operational');
- $logger->alert('published!');
-});
+//code
+namespace EventService
+{
+ class Program
+ {
+ static async Task Main(string[] args)
+ {
+ string PUBSUB_NAME = "order_pub_sub";
+ string TOPIC_NAME = "orders";
+ while(true) {
+ System.Threading.Thread.Sleep(5000);
+ Random random = new Random();
+ int orderId = random.Next(1,1000);
+ CancellationTokenSource source = new CancellationTokenSource();
+ CancellationToken cancellationToken = source.Token;
+ using var client = new DaprClientBuilder().Build();
+ //Using Dapr SDK to publish a topic
+ await client.PublishEventAsync(PUBSUB_NAME, TOPIC_NAME, orderId, cancellationToken);
+ Console.WriteLine("Published data: " + orderId);
+ }
+ }
+ }
+}
```
-You can save this to `app2.php` and while `app1` is running in another terminal, execute:
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
```bash
-dapr --app-id app2 run -- php app2.php
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 --app-ssl dotnet run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```java
+//dependencies
+import io.dapr.client.DaprClient;
+import io.dapr.client.DaprClientBuilder;
+import io.dapr.client.domain.Metadata;
+import static java.util.Collections.singletonMap;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import java.util.Random;
+import java.util.concurrent.TimeUnit;
+
+//code
+@SpringBootApplication
+public class OrderProcessingServiceApplication {
+
+ private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
+
+ public static void main(String[] args) throws InterruptedException{
+ String MESSAGE_TTL_IN_SECONDS = "1000";
+ String TOPIC_NAME = "orders";
+ String PUBSUB_NAME = "order_pub_sub";
+
+ while(true) {
+ TimeUnit.MILLISECONDS.sleep(5000);
+ Random random = new Random();
+ int orderId = random.nextInt(1000-1) + 1;
+ DaprClient client = new DaprClientBuilder().build();
+ //Using Dapr SDK to publish a topic
+ client.publishEvent(
+ PUBSUB_NAME,
+ TOPIC_NAME,
+ orderId,
+ singletonMap(Metadata.TTL_IN_SECONDS, MESSAGE_TTL_IN_SECONDS)).block();
+ log.info("Published data:" + orderId);
+ }
+ }
+}
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 mvn spring-boot:run
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```python
+#dependencies
+import random
+from time import sleep
+import requests
+import logging
+import json
+from dapr.clients import DaprClient
+
+#code
+logging.basicConfig(level = logging.INFO)
+while True:
+ sleep(random.randrange(50, 5000) / 1000)
+ orderId = random.randint(1, 1000)
+ PUBSUB_NAME = 'order_pub_sub'
+ TOPIC_NAME = 'orders'
+ with DaprClient() as client:
+ #Using Dapr SDK to publish a topic
+ result = client.publish_event(
+ pubsub_name=PUBSUB_NAME,
+ topic_name=TOPIC_NAME,
+ data=json.dumps(orderId),
+ data_content_type='application/json',
+ )
+ logging.info('Published data: ' + str(orderId))
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --app-protocol grpc python3 OrderProcessingService.py
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```go
+//dependencies
+import (
+ "context"
+ "log"
+ "math/rand"
+ "time"
+ "strconv"
+ dapr "github.com/dapr/go-sdk/client"
+)
+
+//code
+var (
+ PUBSUB_NAME = "order_pub_sub"
+ TOPIC_NAME = "orders"
+)
+
+func main() {
+ for i := 0; i < 10; i++ {
+        time.Sleep(5000 * time.Millisecond)
+ orderId := rand.Intn(1000-1) + 1
+ client, err := dapr.NewClient()
+ if err != nil {
+ panic(err)
+ }
+ defer client.Close()
+ ctx := context.Background()
+ //Using Dapr SDK to publish a topic
+        if err := client.PublishEvent(ctx, PUBSUB_NAME, TOPIC_NAME, []byte(strconv.Itoa(orderId))); err != nil {
+ panic(err)
+ }
+
+ log.Println("Published data: " + strconv.Itoa(orderId))
+ }
+}
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 go run OrderProcessingService.go
+```
+
+{{% /codetab %}}
+
+{{% codetab %}}
+
+```javascript
+//dependencies
+import { DaprClient, CommunicationProtocolEnum } from 'dapr-client';
+
+const daprHost = "127.0.0.1";
+
+var main = async function() {
+    for(var i=0;i<10;i++) {
+        await sleep(5000);
+ var orderId = Math.floor(Math.random() * (1000 - 1) + 1);
+ start(orderId).catch((e) => {
+ console.error(e);
+ process.exit(1);
+ });
+ }
+}
+
+async function start(orderId) {
+ const PUBSUB_NAME = "order_pub_sub"
+ const TOPIC_NAME = "orders"
+ const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
+ console.log("Published data:" + orderId)
+ //Using Dapr SDK to publish a topic
+ await client.pubsub.publish(PUBSUB_NAME, TOPIC_NAME, orderId);
+}
+
+function sleep(ms) {
+ return new Promise(resolve => setTimeout(resolve, ms));
+}
+
+main();
+```
+
+Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
+
+```bash
+dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-grpc-port 60001 npm start
```
{{% /codetab %}}
{{< /tabs >}}
+## Step 4: ACK-ing a message
+
+In order to tell Dapr that a message was processed successfully, return a `200 OK` response. If Dapr receives any return status code other than `200`, or if your app crashes, Dapr will attempt to redeliver the message following at-least-once semantics.
+
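+As an illustration, below is a minimal HTTP subscriber sketch that ACKs each delivery. Flask is an assumption here and is not part of the SDK examples above; the sketch only shows the status-code contract.
+
+```python
+# Minimal sketch of the ACK contract for an HTTP subscriber; Flask is assumed.
+from flask import Flask, request, jsonify
+
+app = Flask(__name__)
+
+@app.route('/checkout', methods=['POST'])
+def checkout_subscriber():
+    event = request.json                     # CloudEvent envelope delivered by Dapr
+    print(event.get('data'), flush=True)
+    # Returning 200 acknowledges the message. Any other status code, or a crash,
+    # causes Dapr to redeliver it (at-least-once delivery).
+    return jsonify({'success': True}), 200
+
+app.run(port=6002)
+```
+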
## Sending a custom CloudEvent
Dapr automatically takes the data sent on the publish request and wraps it in a CloudEvent 1.0 envelope.
@@ -491,23 +674,23 @@ Read about content types [here](#content-types), and about the [Cloud Events mes
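+In addition to the CLI and HTTP examples below, a minimal Python SDK sketch (an illustration only, assuming the `dapr` Python package) can publish a pre-built CloudEvent by setting the content type to `application/cloudevents+json`:
+
+```python
+# Sketch: publishing a custom CloudEvent with the Python SDK. Declaring the
+# data_content_type as application/cloudevents+json tells Dapr the payload is
+# already a CloudEvent, so it is not wrapped again.
+import json
+from dapr.clients import DaprClient
+
+cloud_event = {
+    "specversion": "1.0",
+    "type": "com.dapr.cloudevent.sent",
+    "source": "testcloudeventspubsub",
+    "id": "someCloudEventId",
+    "datacontenttype": "application/cloudevents+json",
+    "data": {"orderId": "100"},
+}
+
+with DaprClient() as client:
+    client.publish_event(
+        pubsub_name="order_pub_sub",
+        topic_name="orders",
+        data=json.dumps(cloud_event),
+        data_content_type="application/cloudevents+json",
+    )
+```
+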
{{< tabs "Dapr CLI" "HTTP API (Bash)" "HTTP API (PowerShell)">}}
{{% codetab %}}
-Publish a custom CloudEvent to the `deathStarStatus` topic:
+Publish a custom CloudEvent to the `orders` topic:
```bash
-dapr publish --publish-app-id testpubsub --pubsub pubsub --topic deathStarStatus --data '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"status": "completed"}}'
+dapr publish --publish-app-id orderprocessing --pubsub order_pub_sub --topic orders --data '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"orderId": "100"}}'
```
{{% /codetab %}}
{{% codetab %}}
-Publish a custom CloudEvent to the `deathStarStatus` topic:
+Publish a custom CloudEvent to the `orders` topic:
```bash
-curl -X POST http://localhost:3500/v1.0/publish/pubsub/deathStarStatus -H "Content-Type: application/cloudevents+json" -d '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"status": "completed"}}'
+curl -X POST http://localhost:3601/v1.0/publish/order_pub_sub/orders -H "Content-Type: application/cloudevents+json" -d '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"orderId": "100"}}'
```
{{% /codetab %}}
{{% codetab %}}
-Publish a custom CloudEvent to the `deathStarStatus` topic:
+Publish a custom CloudEvent to the `orders` topic:
```powershell
-Invoke-RestMethod -Method Post -ContentType 'application/cloudevents+json' -Body '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"status": "completed"}}' -Uri 'http://localhost:3500/v1.0/publish/pubsub/deathStarStatus'
+Invoke-RestMethod -Method Post -ContentType 'application/cloudevents+json' -Body '{"specversion" : "1.0", "type" : "com.dapr.cloudevent.sent", "source" : "testcloudeventspubsub", "subject" : "Cloud Events Test", "id" : "someCloudEventId", "time" : "2021-08-02T09:00:00Z", "datacontenttype" : "application/cloudevents+json", "data" : {"orderId": "100"}}' -Uri 'http://localhost:3601/v1.0/publish/order_pub_sub/orders'
```
{{% /codetab %}}
@@ -521,4 +704,4 @@ Invoke-RestMethod -Method Post -ContentType 'application/cloudevents+json' -Body
- Learn about [message time-to-live]({{< ref pubsub-message-ttl.md >}})
- Learn [how to configure Pub/Sub components with multiple namespaces]({{< ref pubsub-namespaces.md >}})
- List of [pub/sub components]({{< ref setup-pubsub >}})
-- Read the [API reference]({{< ref pubsub_api.md >}})
\ No newline at end of file
+- Read the [API reference]({{< ref pubsub_api.md >}})
diff --git a/daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md b/daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
index 73565e055..4ac50cf55 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/secrets/howto-secrets.md
@@ -55,7 +55,6 @@ To configure a different kind of secret store see the guidance on [how to config
Run the Dapr sidecar with the application.
-
{{< tabs Dotnet Java Python Go Javascript>}}
{{% codetab %}}
@@ -110,9 +109,16 @@ Once you have a secret store, call Dapr to get the secrets from your application
{{% codetab %}}
```csharp
-
//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
+using System.Text.Json;
//code
namespace EventService
@@ -126,54 +132,59 @@ namespace EventService
//Using Dapr SDK to get a secret
var secret = await client.GetSecretAsync(SECRET_STORE_NAME, "secret");
Console.WriteLine($"Result: {string.Join(", ", secret)}");
- Console.WriteLine($"Result for bulk: {string.Join(", ", secret.Keys)}");
}
}
}
-
```
{{% /codetab %}}
{{% codetab %}}
```java
-
//dependencies
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import java.util.Map;
+
//code
@SpringBootApplication
public class OrderProcessingServiceApplication {
- private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
- private static final ObjectMapper JSON_SERIALIZER = new ObjectMapper();
+ private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
+ private static final ObjectMapper JSON_SERIALIZER = new ObjectMapper();
- private static final String SECRET_STORE_NAME = "localsecretstore";
+ private static final String SECRET_STORE_NAME = "localsecretstore";
- public static void main(String[] args) throws InterruptedException, JsonProcessingException {
- DaprClient client = new DaprClientBuilder().build();
+ public static void main(String[] args) throws InterruptedException, JsonProcessingException {
+ DaprClient client = new DaprClientBuilder().build();
//Using Dapr SDK to get a secret
- Map secret = client.getSecret(SECRET_STORE_NAME, "secret").block();
- log.info("Result: " + JSON_SERIALIZER.writeValueAsString(secret));
- }
+        Map<String, String> secret = client.getSecret(SECRET_STORE_NAME, "secret").block();
+ log.info("Result: " + JSON_SERIALIZER.writeValueAsString(secret));
+ }
}
-
```
{{% /codetab %}}
{{% codetab %}}
```python
-
#dependencies
+import random
+from time import sleep
+import requests
+import logging
from dapr.clients import DaprClient
from dapr.clients.grpc._state import StateItem
from dapr.clients.grpc._request import TransactionalStateOperation, TransactionOperationType
#code
logging.basicConfig(level = logging.INFO)
-
DAPR_STORE_NAME = "localsecretstore"
key = 'secret'
@@ -182,21 +193,21 @@ with DaprClient() as client:
secret = client.get_secret(store_name=DAPR_STORE_NAME, key=key)
logging.info('Result: ')
logging.info(secret.secret)
+ #Using Dapr SDK to get bulk secrets
secret = client.get_bulk_secret(store_name=DAPR_STORE_NAME)
logging.info('Result for bulk secret: ')
logging.info(sorted(secret.secrets.items()))
-
```
{{% /codetab %}}
{{% codetab %}}
```go
-
//dependencies
import (
"context"
"log"
+
dapr "github.com/dapr/go-sdk/client"
)
@@ -209,35 +220,26 @@ func main() {
}
defer client.Close()
ctx := context.Background()
-
+ //Using Dapr SDK to get a secret
secret, err := client.GetSecret(ctx, SECRET_STORE_NAME, "secret", nil)
- if err != nil {
- return nil, errors.Wrap(err, "Got error for accessing key")
- }
-
if secret != nil {
log.Println("Result : ")
log.Println(secret)
}
-
- secretRandom, err := client.GetBulkSecret(ctx, SECRET_STORE_NAME, nil)
- if err != nil {
- return nil, errors.Wrap(err, "Got error for accessing key")
- }
+ //Using Dapr SDK to get bulk secrets
+ secretBulk, err := client.GetBulkSecret(ctx, SECRET_STORE_NAME, nil)
if secret != nil {
log.Println("Result for bulk: ")
- log.Println(secretRandom)
+ log.Println(secretBulk)
}
}
-
```
{{% /codetab %}}
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
@@ -247,14 +249,15 @@ const daprHost = "127.0.0.1";
async function main() {
const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
const SECRET_STORE_NAME = "localsecretstore";
+ //Using Dapr SDK to get a secret
var secret = await client.secret.get(SECRET_STORE_NAME, "secret");
console.log("Result: " + secret);
+ //Using Dapr SDK to get bulk secrets
secret = await client.secret.getBulk(SECRET_STORE_NAME);
console.log("Result for bulk: " + secret);
}
main();
-
```
{{% /codetab %}}
@@ -267,4 +270,4 @@ main();
- [Configure a secret store]({{}})
- [Supported secrets]({{}})
- [Using secrets in components]({{}})
-- [Secret stores quickstart](https://github.com/dapr/quickstarts/tree/master/secretstore)
\ No newline at end of file
+- [Secret stores quickstart](https://github.com/dapr/quickstarts/tree/master/secretstore)
diff --git a/daprdocs/content/en/developing-applications/building-blocks/service-invocation/howto-invoke-discover-services.md b/daprdocs/content/en/developing-applications/building-blocks/service-invocation/howto-invoke-discover-services.md
index 4f0cb2127..77dd0e386 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/service-invocation/howto-invoke-discover-services.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/service-invocation/howto-invoke-discover-services.md
@@ -183,9 +183,15 @@ Below are code examples that leverage Dapr SDKs for service invocation.
{{% codetab %}}
```csharp
-
//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
//code
namespace EventService
@@ -194,113 +200,164 @@ namespace EventService
{
static async Task Main(string[] args)
{
- int orderId = 100;
- CancellationTokenSource source = new CancellationTokenSource();
- CancellationToken cancellationToken = source.Token;
- //Using Dapr SDK to invoke a method
- using var client = new DaprClientBuilder().Build();
- var result = client.CreateInvokeMethodRequest(HttpMethod.Get, "checkout", "checkout/" + orderId, cancellationToken);
- await client.InvokeMethodAsync(result);
+ while(true) {
+ System.Threading.Thread.Sleep(5000);
+ Random random = new Random();
+ int orderId = random.Next(1,1000);
+ CancellationTokenSource source = new CancellationTokenSource();
+ CancellationToken cancellationToken = source.Token;
+ using var client = new DaprClientBuilder().Build();
+ //Using Dapr SDK to invoke a method
+ var result = client.CreateInvokeMethodRequest(HttpMethod.Get, "checkout", "checkout/" + orderId, cancellationToken);
+ await client.InvokeMethodAsync(result);
+ Console.WriteLine("Order requested: " + orderId);
+ Console.WriteLine("Result: " + result);
+ }
}
}
}
-
```
{{% /codetab %}}
{{% codetab %}}
```java
-
//dependencies
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.HttpExtension;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import java.util.Random;
+import java.util.concurrent.TimeUnit;
//code
@SpringBootApplication
public class OrderProcessingServiceApplication {
- public static void main(String[] args) throws InterruptedException {
- int orderId = 100;
- //Using Dapr SDK to invoke a method
- DaprClient client = new DaprClientBuilder().build();
- var result = client.invokeMethod(
- "checkout",
- "checkout/" + orderId,
- null,
- HttpExtension.GET,
- String.class
- );
+
+ private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
+
+ public static void main(String[] args) throws InterruptedException{
+ while(true) {
+ TimeUnit.MILLISECONDS.sleep(5000);
+ Random random = new Random();
+ int orderId = random.nextInt(1000-1) + 1;
+ DaprClient daprClient = new DaprClientBuilder().build();
+ //Using Dapr SDK to invoke a method
+ var result = daprClient.invokeMethod(
+ "checkout",
+ "checkout/" + orderId,
+ null,
+ HttpExtension.GET,
+ String.class
+ );
+ log.info("Order requested: " + orderId);
+ log.info("Result: " + result);
+ }
}
}
-
```
{{% /codetab %}}
{{% codetab %}}
```python
-
#dependencies
+import random
+from time import sleep
+import logging
from dapr.clients import DaprClient
#code
-orderId = 100
-#Using Dapr SDK to invoke a method
-with DaprClient() as client:
- result = client.invoke_method(
- "checkout",
- f"checkout/{orderId}",
- data=b'',
- http_verb="GET"
- )
-
+logging.basicConfig(level = logging.INFO)
+while True:
+ sleep(random.randrange(50, 5000) / 1000)
+ orderId = random.randint(1, 1000)
+ with DaprClient() as daprClient:
+ #Using Dapr SDK to invoke a method
+ result = daprClient.invoke_method(
+ "checkout",
+ f"checkout/{orderId}",
+ data=b'',
+ http_verb="GET"
+ )
+ logging.basicConfig(level = logging.INFO)
+ logging.info('Order requested: ' + str(orderId))
+ logging.info('Result: ' + str(result))
```
{{% /codetab %}}
{{% codetab %}}
```go
-
//dependencies
import (
+ "context"
+ "log"
+ "math/rand"
+ "time"
"strconv"
dapr "github.com/dapr/go-sdk/client"
)
//code
-func main() {
- orderId := 100
- //Using Dapr SDK to invoke a method
- client, err := dapr.NewClient()
- if err != nil {
- panic(err)
- }
- defer client.Close()
- ctx := context.Background()
- result, err := client.InvokeMethod(ctx, "checkout", "checkout/" + strconv.Itoa(orderId), "get")
+type Order struct {
+ orderName string
+ orderNum string
}
+func main() {
+ for i := 0; i < 10; i++ {
+        time.Sleep(5000 * time.Millisecond)
+ orderId := rand.Intn(1000-1) + 1
+ client, err := dapr.NewClient()
+ if err != nil {
+ panic(err)
+ }
+ defer client.Close()
+ ctx := context.Background()
+ //Using Dapr SDK to invoke a method
+ result, err := client.InvokeMethod(ctx, "checkout", "checkout/" + strconv.Itoa(orderId), "get")
+ log.Println("Order requested: " + strconv.Itoa(orderId))
+ log.Println("Result: ")
+ log.Println(result)
+ }
+}
```
{{% /codetab %}}
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
//code
-const daprHost = "127.0.0.1";
+const daprHost = "127.0.0.1";
var main = function() {
- var orderId = 100;
- //Using Dapr SDK to invoke a method
- const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
- const result = await client.invoker.invoke('checkout' , "checkout/" + orderId , HttpMethod.GET);
+ for(var i=0;i<10;i++) {
+ sleep(5000);
+ var orderId = Math.floor(Math.random() * (1000 - 1) + 1);
+ start(orderId).catch((e) => {
+ console.error(e);
+ process.exit(1);
+ });
+ }
+}
+
+async function start(orderId) {
+ const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
+ //Using Dapr SDK to invoke a method
+ const result = await client.invoker.invoke('checkoutservice' , "checkout/" + orderId , HttpMethod.GET);
+ console.log("Order requested: " + orderId);
+ console.log("Result: " + result);
+}
+
+function sleep(ms) {
+ return new Promise(resolve => setTimeout(resolve, ms));
}
main();
-
```
{{% /codetab %}}
diff --git a/daprdocs/content/en/developing-applications/building-blocks/state-management/howto-get-save-state.md b/daprdocs/content/en/developing-applications/building-blocks/state-management/howto-get-save-state.md
index 1684b7ddb..e40eb7995 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/state-management/howto-get-save-state.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/state-management/howto-get-save-state.md
@@ -81,7 +81,15 @@ Below are code examples that leverage Dapr SDKs for saving and retrieving a sing
```csharp
//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
+using System.Text.Json;
//code
namespace EventService
@@ -91,17 +99,20 @@ namespace EventService
static async Task Main(string[] args)
{
string DAPR_STORE_NAME = "statestore";
-
- int orderId = 100;
- //Using Dapr SDK to save and get state
- using var client = new DaprClientBuilder().Build();
- await client.SaveStateAsync(DAPR_STORE_NAME, "order_1", orderId.ToString());
- await client.SaveStateAsync(DAPR_STORE_NAME, "order_2", orderId.ToString());
- var result = await client.GetStateAsync(DAPR_STORE_NAME, orderId.ToString());
+ while(true) {
+ System.Threading.Thread.Sleep(5000);
+ using var client = new DaprClientBuilder().Build();
+ Random random = new Random();
+ int orderId = random.Next(1,1000);
+ //Using Dapr SDK to save and get state
+ await client.SaveStateAsync(DAPR_STORE_NAME, "order_1", orderId.ToString());
+ await client.SaveStateAsync(DAPR_STORE_NAME, "order_2", orderId.ToString());
+                var result = await client.GetStateAsync<string>(DAPR_STORE_NAME, "order_1");
+ Console.WriteLine("Result after get: " + result);
+ }
}
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -116,12 +127,17 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```java
-
//dependencies
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
+import io.dapr.client.domain.TransactionalStateOperation;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
import reactor.core.publisher.Mono;
+import java.util.Random;
+import java.util.concurrent.TimeUnit;
//code
@SpringBootApplication
@@ -129,18 +145,23 @@ public class OrderProcessingServiceApplication {
private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
- public static void main(String[] args) throws InterruptedException {
- String STATE_STORE_NAME = "statestore";
+ private static final String STATE_STORE_NAME = "statestore";
- int orderId = 100;
- //Using Dapr SDK to save and get state
- DaprClient client = new DaprClientBuilder().build();
- client.saveState(STATE_STORE_NAME, "order_1", Integer.toString(orderId)).block();
- client.saveState(STATE_STORE_NAME, "order_2", Integer.toString(orderId)).block();
- Mono> result = client.getState(STATE_STORE_NAME, "order_1", String.class);
+ public static void main(String[] args) throws InterruptedException{
+ while(true) {
+ TimeUnit.MILLISECONDS.sleep(5000);
+ Random random = new Random();
+ int orderId = random.nextInt(1000-1) + 1;
+ DaprClient client = new DaprClientBuilder().build();
+ //Using Dapr SDK to save and get state
+ client.saveState(STATE_STORE_NAME, "order_1", Integer.toString(orderId)).block();
+ client.saveState(STATE_STORE_NAME, "order_2", Integer.toString(orderId)).block();
+            Mono<State<String>> result = client.getState(STATE_STORE_NAME, "order_1", String.class);
+ log.info("Result after get" + result);
+ }
}
-}
+}
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -155,22 +176,26 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```python
-
#dependencies
+import random
+from time import sleep
+import requests
+import logging
from dapr.clients import DaprClient
+from dapr.clients.grpc._state import StateItem
+from dapr.clients.grpc._request import TransactionalStateOperation, TransactionOperationType
#code
logging.basicConfig(level = logging.INFO)
-
DAPR_STORE_NAME = "statestore"
-
-orderId = 100
-#Using Dapr SDK to save and get state
-with DaprClient() as client:
- client.save_state(DAPR_STORE_NAME, "order_1", str(orderId))
- result = client.get_state(DAPR_STORE_NAME, "order_1")
- logging.info('Result after get: ' + str(result))
-
+while True:
+ sleep(random.randrange(50, 5000) / 1000)
+ orderId = random.randint(1, 1000)
+ with DaprClient() as client:
+ #Using Dapr SDK to save and get state
+ client.save_state(DAPR_STORE_NAME, "order_1", str(orderId))
+ result = client.get_state(DAPR_STORE_NAME, "order_1")
+ logging.info('Result after get: ' + result.data.decode('utf-8'))
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -185,7 +210,6 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```go
-
//dependencies
import (
"context"
@@ -194,34 +218,32 @@ import (
"time"
"strconv"
dapr "github.com/dapr/go-sdk/client"
-
)
//code
func main() {
-
- STATE_STORE_NAME := "statestore"
-
- orderId := 100
-
- //Using Dapr SDK to save and get state
- client, err := dapr.NewClient()
- if err != nil {
- panic(err)
- }
- defer client.Close()
- ctx := context.Background()
-
- if err := client.SaveState(ctx, STATE_STORE_NAME, "order_1", []byte(strconv.Itoa(orderId))); err != nil {
- panic(err)
- }
-
- result, err := client.GetState(ctx, STATE_STORE_NAME, "order_1")
- if err != nil {
- panic(err)
- }
+ for i := 0; i < 10; i++ {
+        time.Sleep(5000 * time.Millisecond)
+ orderId := rand.Intn(1000-1) + 1
+ client, err := dapr.NewClient()
+ STATE_STORE_NAME := "statestore"
+ if err != nil {
+ panic(err)
+ }
+ defer client.Close()
+ ctx := context.Background()
+ //Using Dapr SDK to save and get state
+ if err := client.SaveState(ctx, STATE_STORE_NAME, "order_1", []byte(strconv.Itoa(orderId))); err != nil {
+ panic(err)
+ }
+        result, err := client.GetState(ctx, STATE_STORE_NAME, "order_1")
+ if err != nil {
+ panic(err)
+ }
+ log.Println("Result after get: ")
+ log.Println(result)
+ }
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -236,19 +258,26 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
//code
const daprHost = "127.0.0.1";
-
var main = function() {
- const STATE_STORE_NAME = "statestore";
+ for(var i=0;i<10;i++) {
+ sleep(5000);
+ var orderId = Math.floor(Math.random() * (1000 - 1) + 1);
+ start(orderId).catch((e) => {
+ console.error(e);
+ process.exit(1);
+ });
+ }
+}
- var orderId = 100;
- //Using Dapr SDK to save and get state
+async function start(orderId) {
const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
+ const STATE_STORE_NAME = "statestore";
+ //Using Dapr SDK to save and get state
await client.state.save(STATE_STORE_NAME, [
{
key: "order_1",
@@ -260,10 +289,14 @@ var main = function() {
}
]);
var result = await client.state.get(STATE_STORE_NAME, "order_1");
+ console.log("Result after get: " + result);
+}
+
+function sleep(ms) {
+ return new Promise(resolve => setTimeout(resolve, ms));
}
main();
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -329,7 +362,6 @@ Below are code examples that leverage Dapr SDKs for deleting the state.
{{% codetab %}}
```csharp
-
//dependencies
using Dapr.Client;
@@ -347,7 +379,6 @@ namespace EventService
}
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -362,10 +393,10 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```java
-
//dependencies
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
//code
@SpringBootApplication
@@ -379,7 +410,6 @@ public class OrderProcessingServiceApplication {
client.deleteState(STATE_STORE_NAME, "order_1", storedEtag, null).block();
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -394,19 +424,16 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```python
-
#dependencies
from dapr.clients.grpc._request import TransactionalStateOperation, TransactionOperationType
#code
logging.basicConfig(level = logging.INFO)
-
DAPR_STORE_NAME = "statestore"
#Using Dapr SDK to delete the state
with DaprClient() as client:
client.delete_state(store_name=DAPR_STORE_NAME, key="order_1")
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -421,7 +448,6 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```go
-
//dependencies
import (
"context"
@@ -431,9 +457,7 @@ import (
//code
func main() {
-
STATE_STORE_NAME := "statestore"
-
//Using Dapr SDK to delete the state
client, err := dapr.NewClient()
if err != nil {
@@ -446,7 +470,6 @@ func main() {
panic(err)
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -461,23 +484,19 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
//code
const daprHost = "127.0.0.1";
-
var main = function() {
const STATE_STORE_NAME = "statestore";
-
//Using Dapr SDK to save and get state
const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
await client.state.delete(STATE_STORE_NAME, "order_1");
}
main();
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -515,11 +534,11 @@ Below are code examples that leverage Dapr SDKs for saving and retrieving multip
{{% codetab %}}
```java
-
//dependencies
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
+import java.util.Arrays;
//code
@SpringBootApplication
@@ -529,15 +548,12 @@ public class OrderProcessingServiceApplication {
public static void main(String[] args) throws InterruptedException{
String STATE_STORE_NAME = "statestore";
-
- int orderId = 100;
//Using Dapr SDK to retrieve multiple states
DaprClient client = new DaprClientBuilder().build();
        Mono<List<State<String>>> resultBulk = client.getBulkState(STATE_STORE_NAME,
Arrays.asList("order_1", "order_2"), String.class);
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -548,27 +564,22 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% /codetab %}}
-
{{% codetab %}}
```python
-
#dependencies
from dapr.clients import DaprClient
from dapr.clients.grpc._state import StateItem
#code
logging.basicConfig(level = logging.INFO)
-
DAPR_STORE_NAME = "statestore"
-
orderId = 100
#Using Dapr SDK to save and retrieve multiple states
with DaprClient() as client:
client.save_bulk_state(store_name=DAPR_STORE_NAME, states=[StateItem(key="order_2", value=str(orderId))])
result = client.get_bulk_state(store_name=DAPR_STORE_NAME, keys=["order_1", "order_2"], states_metadata={"metakey": "metavalue"}).items
logging.info('Result after get bulk: ' + str(result))
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -579,21 +590,16 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% /codetab %}}
-
-
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
//code
const daprHost = "127.0.0.1";
-
var main = function() {
const STATE_STORE_NAME = "statestore";
-
var orderId = 100;
//Using Dapr SDK to save and retrieve multiple states
const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
@@ -611,7 +617,6 @@ var main = function() {
}
main();
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -662,9 +667,16 @@ Below are code examples that leverage Dapr SDKs for performing state transaction
{{% codetab %}}
```csharp
-
//dependencies
+using System;
+using System.Collections.Generic;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading.Tasks;
using Dapr.Client;
+using Microsoft.AspNetCore.Mvc;
+using System.Threading;
+using System.Text.Json;
//code
namespace EventService
@@ -674,22 +686,26 @@ namespace EventService
static async Task Main(string[] args)
{
string DAPR_STORE_NAME = "statestore";
-
- int orderId = 100;
- //Using Dapr SDK to perform the state transactions
- using var client = new DaprClientBuilder().Build();
- var requests = new List()
- {
- new StateTransactionRequest("order_3", JsonSerializer.SerializeToUtf8Bytes(orderId.ToString()), StateOperationType.Upsert),
- new StateTransactionRequest("order_2", null, StateOperationType.Delete)
- };
- CancellationTokenSource source = new CancellationTokenSource();
- CancellationToken cancellationToken = source.Token;
- await client.ExecuteStateTransactionAsync(DAPR_STORE_NAME, requests, cancellationToken: cancellationToken);
+ while(true) {
+ System.Threading.Thread.Sleep(5000);
+ Random random = new Random();
+ int orderId = random.Next(1,1000);
+ using var client = new DaprClientBuilder().Build();
+                var requests = new List<StateTransactionRequest>()
+ {
+ new StateTransactionRequest("order_3", JsonSerializer.SerializeToUtf8Bytes(orderId.ToString()), StateOperationType.Upsert),
+ new StateTransactionRequest("order_2", null, StateOperationType.Delete)
+ };
+ CancellationTokenSource source = new CancellationTokenSource();
+ CancellationToken cancellationToken = source.Token;
+ //Using Dapr SDK to perform the state transactions
+ await client.ExecuteStateTransactionAsync(DAPR_STORE_NAME, requests, cancellationToken: cancellationToken);
+ Console.WriteLine("Order requested: " + orderId);
+ }
}
}
}
-
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -704,13 +720,19 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```java
-
//dependencies
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
import io.dapr.client.domain.TransactionalStateOperation;
-
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import reactor.core.publisher.Mono;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Random;
+import java.util.concurrent.TimeUnit;
//code
@SpringBootApplication
@@ -718,21 +740,26 @@ public class OrderProcessingServiceApplication {
private static final Logger log = LoggerFactory.getLogger(OrderProcessingServiceApplication.class);
+ private static final String STATE_STORE_NAME = "statestore";
+
public static void main(String[] args) throws InterruptedException{
- String STATE_STORE_NAME = "statestore";
-
- int orderId = 100;
- //Using Dapr SDK to perform the state transactions
- DaprClient client = new DaprClientBuilder().build();
- List> operationList = new ArrayList<>();
- operationList.add(new TransactionalStateOperation<>(TransactionalStateOperation.OperationType.UPSERT,
- new State<>("order_3", Integer.toString(orderId), "")));
- operationList.add(new TransactionalStateOperation<>(TransactionalStateOperation.OperationType.DELETE,
- new State<>("order_2")));
- client.executeStateTransaction(STATE_STORE_NAME, operationList).block();
+ while(true) {
+ TimeUnit.MILLISECONDS.sleep(5000);
+ Random random = new Random();
+ int orderId = random.nextInt(1000-1) + 1;
+ DaprClient client = new DaprClientBuilder().build();
+            List<TransactionalStateOperation<?>> operationList = new ArrayList<>();
+ operationList.add(new TransactionalStateOperation<>(TransactionalStateOperation.OperationType.UPSERT,
+ new State<>("order_3", Integer.toString(orderId), "")));
+ operationList.add(new TransactionalStateOperation<>(TransactionalStateOperation.OperationType.DELETE,
+ new State<>("order_2")));
+ //Using Dapr SDK to perform the state transactions
+ client.executeStateTransaction(STATE_STORE_NAME, operationList).block();
+ log.info("Order requested: " + orderId);
+ }
}
-}
+}
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -743,37 +770,42 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% /codetab %}}
-
{{% codetab %}}
-
```python
-
#dependencies
+import random
+from time import sleep
+import requests
+import logging
from dapr.clients import DaprClient
from dapr.clients.grpc._state import StateItem
from dapr.clients.grpc._request import TransactionalStateOperation, TransactionOperationType
#code
-logging.basicConfig(level = logging.INFO)
-
+logging.basicConfig(level = logging.INFO)
DAPR_STORE_NAME = "statestore"
+while True:
+ sleep(random.randrange(50, 5000) / 1000)
+ orderId = random.randint(1, 1000)
+ with DaprClient() as client:
+ #Using Dapr SDK to perform the state transactions
+ client.execute_state_transaction(store_name=DAPR_STORE_NAME, operations=[
+ TransactionalStateOperation(
+ operation_type=TransactionOperationType.upsert,
+ key="order_3",
+ data=str(orderId)),
+ TransactionalStateOperation(key="order_3", data=str(orderId)),
+ TransactionalStateOperation(
+ operation_type=TransactionOperationType.delete,
+ key="order_2",
+ data=str(orderId)),
+ TransactionalStateOperation(key="order_2", data=str(orderId))
+ ])
-orderId = 100
-#Using Dapr SDK to perform the state transactions
-with DaprClient() as client:
- client.execute_state_transaction(store_name=DAPR_STORE_NAME, operations=[
- TransactionalStateOperation(
- operation_type=TransactionOperationType.upsert,
- key="order_3",
- data=str(orderId)),
- TransactionalStateOperation(key="order_3", data=str(orderId)),
- TransactionalStateOperation(
- operation_type=TransactionOperationType.delete,
- key="order_2",
- data=str(orderId)),
- TransactionalStateOperation(key="order_2", data=str(orderId))
- ])
-
+ client.delete_state(store_name=DAPR_STORE_NAME, key="order_1")
+            logging.info('Order requested: ' + str(orderId))
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
@@ -788,38 +820,48 @@ dapr run --app-id orderprocessing --app-port 6001 --dapr-http-port 3601 --dapr-g
{{% codetab %}}
```javascript
-
//dependencies
import { DaprClient, HttpMethod, CommunicationProtocolEnum } from 'dapr-client';
//code
const daprHost = "127.0.0.1";
-
var main = function() {
- const STATE_STORE_NAME = "statestore";
+ for(var i=0;i<10;i++) {
+ sleep(5000);
+ var orderId = Math.floor(Math.random() * (1000 - 1) + 1);
+ start(orderId).catch((e) => {
+ console.error(e);
+ process.exit(1);
+ });
+ }
+}
- var orderId = 100;
- //Using Dapr SDK to save and retrieve multiple states
+async function start(orderId) {
const client = new DaprClient(daprHost, process.env.DAPR_HTTP_PORT, CommunicationProtocolEnum.HTTP);
+ const STATE_STORE_NAME = "statestore";
+ //Using Dapr SDK to save and retrieve multiple states
await client.state.transaction(STATE_STORE_NAME, [
{
- operation: "upsert",
- request: {
- key: "order_3",
- value: orderId.toString()
- }
+ operation: "upsert",
+ request: {
+ key: "order_3",
+ value: orderId.toString()
+ }
},
{
- operation: "delete",
- request: {
- key: "order_2"
- }
+ operation: "delete",
+ request: {
+ key: "order_2"
+ }
}
]);
}
-main();
+function sleep(ms) {
+ return new Promise(resolve => setTimeout(resolve, ms));
+}
+main();
```
Navigate to the directory containing the above code, then run the following command to launch a Dapr sidecar and run the application:
diff --git a/daprdocs/content/en/developing-applications/integrations/autoscale-keda.md b/daprdocs/content/en/developing-applications/integrations/autoscale-keda.md
index e6fd74149..7e685010f 100644
--- a/daprdocs/content/en/developing-applications/integrations/autoscale-keda.md
+++ b/daprdocs/content/en/developing-applications/integrations/autoscale-keda.md
@@ -6,7 +6,7 @@ description: "How to configure your Dapr application to autoscale using KEDA"
weight: 2000
---
-Dapr, with its modular building-block approach, along with the 10+ different [pub/sub components]({{< ref pubsub >}}), make it easy to write message processing applications. Since Dapr can run in many environments (e.g. VM, bare-metal, Cloud, or Edge) the autoscaling of Dapr applications is managed by the hosting later.
+Dapr, with its modular building-block approach, along with the 10+ different [pub/sub components]({{< ref pubsub >}}), makes it easy to write message processing applications. Since Dapr can run in many environments (e.g. VM, bare-metal, Cloud, or Edge), the autoscaling of Dapr applications is managed by the hosting layer.
For Kubernetes, Dapr integrates with [KEDA](https://github.com/kedacore/keda), an event-driven autoscaler for Kubernetes. Many of Dapr's pub/sub components overlap with the scalers provided by [KEDA](https://github.com/kedacore/keda), so it's easy to configure your Dapr deployment on Kubernetes to autoscale based on back pressure using KEDA.
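As a hedged illustration of what such a configuration might look like, the sketch below shows a KEDA `ScaledObject` that scales the Deployment hosting a Dapr subscriber based on Kafka consumer lag; the resource names, topic, and consumer group are hypothetical and must match your own deployment and pub/sub component:

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: subscriber-scaler          # hypothetical name
spec:
  scaleTargetRef:
    name: my-subscriber            # hypothetical Deployment running the Dapr app
  triggers:
  - type: kafka                    # choose the scaler that matches your pub/sub backend
    metadata:
      bootstrapServers: kafka.default.svc.cluster.local:9092
      consumerGroup: group1        # same consumer group the Dapr component uses
      topic: orders                # hypothetical topic
      lagThreshold: "5"
```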
diff --git a/daprdocs/content/en/getting-started/quickstarts.md b/daprdocs/content/en/getting-started/quickstarts.md
index bfb892117..de6a3313c 100644
--- a/daprdocs/content/en/getting-started/quickstarts.md
+++ b/daprdocs/content/en/getting-started/quickstarts.md
@@ -6,7 +6,7 @@ weight: 60
description: "Tutorials with code samples that are aimed to get you started quickly with Dapr"
---
-The [Dapr Quickstarts](https://github.com/dapr/quickstarts/tree/v1.0.0) are a collection of tutorials with code samples that are aimed to get you started quickly with Dapr, each highlighting a different Dapr capability.
+The [Dapr Quickstarts](https://github.com/dapr/quickstarts/tree/v1.5.0) are a collection of tutorials with code samples that are aimed to get you started quickly with Dapr, each highlighting a different Dapr capability.
- A good place to start is the hello-world quickstart; it demonstrates how to run Dapr in standalone mode locally on your machine and shows state management and service invocation in a simple application.
- Next, if you are familiar with Kubernetes and want to see how to run the same application in a Kubernetes environment, look for the hello-kubernetes quickstart. Other quickstarts, such as pub-sub, bindings, and the distributed-calculator quickstart, explore different Dapr capabilities, include instructions for running both locally and on Kubernetes, and can be completed in any order. A full list of the quickstarts can be found below.
@@ -17,12 +17,12 @@ The [Dapr Quickstarts](https://github.com/dapr/quickstarts/tree/v1.0.0) are a co
| Quickstart | Description |
|--------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| [Hello World](https://github.com/dapr/quickstarts/tree/v1.4.0/hello-world) | Demonstrates how to run Dapr locally. Highlights service invocation and state management. |
-| [Hello Kubernetes](https://github.com/dapr/quickstarts/tree/v1.4.0/hello-kubernetes) | Demonstrates how to run Dapr in Kubernetes. Highlights service invocation and state management. |
-| [Distributed Calculator](https://github.com/dapr/quickstarts/tree/v1.4.0/distributed-calculator) | Demonstrates a distributed calculator application that uses Dapr services to power a React web app. Highlights polyglot (multi-language) programming, service invocation and state management. |
-| [Pub/Sub](https://github.com/dapr/quickstarts/tree/v1.4.0/pub-sub) | Demonstrates how to use Dapr to enable pub-sub applications. Uses Redis as a pub-sub component. |
-| [Bindings](https://github.com/dapr/quickstarts/tree/v1.4.0/bindings) | Demonstrates how to use Dapr to create input and output bindings to other components. Uses bindings to Kafka. |
-| [Middleware](https://github.com/dapr/quickstarts/tree/v1.4.0/middleware) | Demonstrates use of Dapr middleware to enable OAuth 2.0 authorization. |
-| [Observability](https://github.com/dapr/quickstarts/tree/v1.4.0/observability) | Demonstrates Dapr tracing capabilities. Uses Zipkin as a tracing component. |
-| [Secret Store](https://github.com/dapr/quickstarts/tree/v1.4.0/secretstore) | Demonstrates the use of Dapr Secrets API to access secret stores. |
+| [Hello World](https://github.com/dapr/quickstarts/tree/v1.5.0/hello-world) | Demonstrates how to run Dapr locally. Highlights service invocation and state management. |
+| [Hello Kubernetes](https://github.com/dapr/quickstarts/tree/v1.5.0/hello-kubernetes) | Demonstrates how to run Dapr in Kubernetes. Highlights service invocation and state management. |
+| [Distributed Calculator](https://github.com/dapr/quickstarts/tree/v1.5.0/distributed-calculator) | Demonstrates a distributed calculator application that uses Dapr services to power a React web app. Highlights polyglot (multi-language) programming, service invocation and state management. |
+| [Pub/Sub](https://github.com/dapr/quickstarts/tree/v1.5.0/pub-sub) | Demonstrates how to use Dapr to enable pub-sub applications. Uses Redis as a pub-sub component. |
+| [Bindings](https://github.com/dapr/quickstarts/tree/v1.5.0/bindings) | Demonstrates how to use Dapr to create input and output bindings to other components. Uses bindings to Kafka. |
+| [Middleware](https://github.com/dapr/quickstarts/tree/v1.5.0/middleware) | Demonstrates use of Dapr middleware to enable OAuth 2.0 authorization. |
+| [Observability](https://github.com/dapr/quickstarts/tree/v1.5.0/observability) | Demonstrates Dapr tracing capabilities. Uses Zipkin as a tracing component. |
+| [Secret Store](https://github.com/dapr/quickstarts/tree/v1.5.0/secretstore) | Demonstrates the use of Dapr Secrets API to access secret stores. |
diff --git a/daprdocs/content/en/operations/monitoring/tracing/supported-tracing-backends/jaeger.md b/daprdocs/content/en/operations/monitoring/tracing/supported-tracing-backends/jaeger.md
index 44a453649..e4e6b4903 100644
--- a/daprdocs/content/en/operations/monitoring/tracing/supported-tracing-backends/jaeger.md
+++ b/daprdocs/content/en/operations/monitoring/tracing/supported-tracing-backends/jaeger.md
@@ -83,7 +83,7 @@ spec:
#### Production
-Jaeger uses Elasticsearch as the backend storage, and you can create a secret in k8s cluster to access Elasticsearch server with access control. See [Configuring and Deploying Jaeger](https://docs.openshift.com/container-platform/4.7/jaeger/jaeger_install/rhbjaeger-deploying.html)
+Jaeger uses Elasticsearch as its backend storage. To access an Elasticsearch server that has access control enabled, you can create a secret in the Kubernetes cluster. See [Configuring and Deploying Jaeger](https://docs.openshift.com/container-platform/4.9/distr_tracing/distr_tracing_install/distr-tracing-deploying.html)
```shell
kubectl create secret generic jaeger-secret --from-literal=ES_PASSWORD='xxx' --from-literal=ES_USERNAME='xxx' -n ${NAMESPACE}
diff --git a/daprdocs/content/en/operations/security/oauth.md b/daprdocs/content/en/operations/security/oauth.md
index 5fedd524d..ef663cbff 100644
--- a/daprdocs/content/en/operations/security/oauth.md
+++ b/daprdocs/content/en/operations/security/oauth.md
@@ -34,12 +34,14 @@ To figure the Dapr OAuth middleware, you'll need to collect the following inform
Authorization/Token URLs of some of the popular authorization servers:
+
| Server | Authorization URL | Token URL |
|---------|-------------------|-----------|
|Azure AAD|||
|GitHub|||
|Google|||
|Twitter|||
+
## Define the middleware component definition
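For reference, a minimal sketch of such a component definition is shown below. The metadata names follow the `middleware.http.oauth2` component; every value is a placeholder to be replaced with the information collected above:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: oauth2
spec:
  type: middleware.http.oauth2
  version: v1
  metadata:
  - name: clientId
    value: "<your client ID>"
  - name: clientSecret
    value: "<your client secret>"
  - name: scopes
    value: "<scopes>"
  - name: authURL
    value: "<authorization URL from the table above>"
  - name: tokenURL
    value: "<token URL from the table above>"
  - name: redirectURL
    value: "<redirect URL registered with the authorization server>"
  - name: authHeaderName
    value: "authorization"
```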
diff --git a/daprdocs/content/en/operations/support/support-release-policy.md b/daprdocs/content/en/operations/support/support-release-policy.md
index 318036117..6bd27c761 100644
--- a/daprdocs/content/en/operations/support/support-release-policy.md
+++ b/daprdocs/content/en/operations/support/support-release-policy.md
@@ -45,7 +45,9 @@ The table below shows the versions of Dapr releases that have been tested togeth
| Sep 22nd 2021 | 1.4.1 | 1.4.0 | Java 1.3.0 Go 1.2.0 PHP 1.1.0 Python 1.3.0 .NET 1.4.0 | 0.8.0 | Supported
| Sep 24th 2021 | 1.4.2 | 1.4.0 | Java 1.3.0 Go 1.2.0 PHP 1.1.0 Python 1.3.0 .NET 1.4.0 | 0.8.0 | Supported |
| Oct 7th 2021 | 1.4.3 | 1.4.0 | Java 1.3.0 Go 1.2.0 PHP 1.1.0 Python 1.3.0 .NET 1.4.0 | 0.8.0 | Supported |
-| Nov 11th 2021 | 1.5.0 | 1.5.0 | Java 1.3.0 Go 1.3.0 PHP 1.1.0 Python 1.4.0 .NET 1.5.0 JS 1.0.2 | 0.9.0 | Supported (current) |
+| Dec 6th 2021 | 1.4.4 | 1.4.0 | Java 1.3.0 Go 1.2.0 PHP 1.1.0 Python 1.3.0 .NET 1.4.0 | 0.8.0 | Supported |
+| Nov 11th 2021 | 1.5.0 | 1.5.0 | Java 1.3.0 Go 1.3.0 PHP 1.1.0 Python 1.4.0 .NET 1.5.0 JS 1.0.2 | 0.9.0 | Supported |
+| Dec 6th 2021 | 1.5.1 | 1.5.1 | Java 1.3.0 Go 1.3.0 PHP 1.1.0 Python 1.4.0 .NET 1.5.0 JS 1.0.2 | 0.9.0 | Supported (current) |
## Upgrade paths
After the 1.0 release of the runtime, there may be situations where it is necessary to explicitly upgrade through an additional release to reach the desired target. For example, an upgrade from v1.0 to v1.2 may need to pass through v1.1.
@@ -59,23 +61,22 @@ General guidance on upgrading can be found for [self hosted mode]({{}})
+* [How to set up logging in Dapr]({{< ref "logging.md" >}})
diff --git a/daprdocs/content/en/reference/api/service_invocation_api.md b/daprdocs/content/en/reference/api/service_invocation_api.md
index 94888067b..15de82351 100644
--- a/daprdocs/content/en/reference/api/service_invocation_api.md
+++ b/daprdocs/content/en/reference/api/service_invocation_api.md
@@ -16,7 +16,7 @@ This endpoint lets you invoke a method in another Dapr enabled app.
### HTTP Request
```
-POST/GET/PUT/DELETE http://localhost:<daprPort>/v1.0/invoke/<appId>/method/<method-name>
+PATCH/POST/GET/PUT/DELETE http://localhost:<daprPort>/v1.0/invoke/<appId>/method/<method-name>
```
### HTTP Response codes
diff --git a/daprdocs/content/en/reference/components-reference/supported-bindings/s3.md b/daprdocs/content/en/reference/components-reference/supported-bindings/s3.md
index b48aa37d2..68754b50a 100644
--- a/daprdocs/content/en/reference/components-reference/supported-bindings/s3.md
+++ b/daprdocs/content/en/reference/components-reference/supported-bindings/s3.md
@@ -27,6 +27,8 @@ spec:
value: mybucket
- name: region
value: us-west-2
+ - name: endpoint
+ value: s3-us-west-2.amazonaws.com
- name: accessKey
value: *****************
- name: secretKey
@@ -37,6 +39,8 @@ spec:
value:
- name: encodeBase64
value:
+ - name: forcePathStyle
+ value:
```
{{% alert title="Warning" color="warning" %}}
@@ -49,9 +53,11 @@ The above example uses secrets as plain strings. It is recommended to use a secr
|--------------------|:--------:|------------|-----|---------|
| bucket | Y | Output | The name of the S3 bucket to write to | `"bucket"` |
| region | Y | Output | The specific AWS region | `"us-east-1"` |
+| endpoint | N | Output | The specific AWS endpoint | `"s3-us-east-1.amazonaws.com"` |
| accessKey | Y | Output | The AWS Access Key to access this resource | `"key"` |
| secretKey | Y | Output | The AWS Secret Access Key to access this resource | `"secretAccessKey"` |
| sessionToken | N | Output | The AWS session token to use | `"sessionToken"` |
+| forcePathStyle | N | Output | Currently the Amazon S3 SDK supports virtual-hosted-style and path-style access. `true` uses the path-style format, like `https://<endpoint>/<bucketName>/<key>`. `false` uses the virtual-hosted-style format, like `https://<bucketName>.<endpoint>/<key>`. Defaults to `false` | `true`, `false` |
| decodeBase64 | N | Output | Configuration to decode base64 file content before saving to bucket storage (in case of saving a file with binary content). `true` is the only allowed positive value. Other positive variations like `"True", "1"` are not acceptable. Defaults to `false` | `true`, `false` |
| encodeBase64 | N | Output | Configuration to encode base64 file content before returning the content (in case of opening a file with binary content). `true` is the only allowed positive value. Other positive variations like `"True", "1"` are not acceptable. Defaults to `false` | `true`, `false` |
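For S3-compatible storage (MinIO, for example), `endpoint` and `forcePathStyle` are typically set together. Below is a hedged sketch; the endpoint URL, bucket name, and credentials are placeholders:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: s3binding
spec:
  type: bindings.aws.s3
  version: v1
  metadata:
  - name: bucket
    value: mybucket                 # placeholder bucket name
  - name: region
    value: us-east-1
  - name: endpoint
    value: http://localhost:9000    # placeholder S3-compatible endpoint
  - name: accessKey
    value: "<access key>"
  - name: secretKey
    value: "<secret key>"
  - name: forcePathStyle
    value: "true"                   # S3-compatible servers usually require path-style addressing
```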
@@ -136,6 +142,8 @@ spec:
value: mybucket
- name: region
value: us-west-2
+ - name: endpoint
+ value: s3-us-west-2.amazonaws.com
- name: accessKey
value: *****************
- name: secretKey
@@ -144,6 +152,8 @@ spec:
value: mysession
- name: decodeBase64
value:
+ - name: forcePathStyle
+ value:
```
Then you can upload it as you would normally:
diff --git a/daprdocs/layouts/shortcodes/dapr-latest-version.html b/daprdocs/layouts/shortcodes/dapr-latest-version.html
index 60da16a51..77638b68f 100644
--- a/daprdocs/layouts/shortcodes/dapr-latest-version.html
+++ b/daprdocs/layouts/shortcodes/dapr-latest-version.html
@@ -1 +1 @@
-{{- if .Get "short" }}1.5{{ else if .Get "long" }}1.5.0{{ else if .Get "cli" }}1.5.0{{ else }}1.5.0{{ end -}}
+{{- if .Get "short" }}1.5{{ else if .Get "long" }}1.5.1{{ else if .Get "cli" }}1.5.1{{ else }}1.5.1{{ end -}}
diff --git a/daprdocs/static/images/building-block-input-binding-example.png b/daprdocs/static/images/building-block-input-binding-example.png
new file mode 100644
index 0000000000000000000000000000000000000000..408c373786918a32c28bfd2005d5894b630b8162
GIT binary patch
literal 88366