mirror of https://github.com/dapr/docs.git
Merge pull request #3621 from hhunter-ms/issue_2699
[pub/sub] remove `deadLetterTopic` for 1.8 declarative subscriptions
commit 41468c4c5c
```diff
@@ -37,7 +37,7 @@ jobs:
 app_location: "/daprdocs" # App source code path
 api_location: "api" # Api source code path - optional
 output_location: "public" # Built app content directory - optional
-app_build_command: "hugo"
+app_build_command: "git config --global --add safe.directory /github/workspace && hugo"
 ###### End of Repository/Build Configurations ######
 
 close_pull_request_job:
```
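The change above works around Git's repository-ownership check (the mitigation for CVE-2022-24765): inside the Static Web Apps build container, `/github/workspace` is owned by a different user than the build process, so Hugo's Git integration fails with a "dubious ownership" error unless the directory is marked safe. A hedged sketch of how the step could look in the workflow — surrounding keys are abridged and the exact indentation is an assumption, not taken from the diff:

```yaml
# Sketch of the Azure/static-web-apps-deploy step's inputs (abridged)
with:
  app_location: "/daprdocs"
  output_location: "public"
  # Mark the checkout as a safe Git directory before building, otherwise
  # Hugo's Git info lookups fail with "detected dubious ownership".
  app_build_command: "git config --global --add safe.directory /github/workspace && hugo"
```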
```diff
@@ -27,7 +27,7 @@ Alternatively, you can use the Dapr SDK in [.NET]({{< ref "dotnet-actors" >}}),
 Actors can save state reliably using state management capability.
 You can interact with Dapr through HTTP/gRPC endpoints for state management.
 
-To use actors, your state store must support multi-item transactions. This means your state store [component](https://github.com/dapr/components-contrib/tree/master/state) must implement the [TransactionalStore](https://github.com/dapr/components-contrib/blob/master/state/transactional_store.go) interface. The list of components that support transactions/actors can be found here: [supported state stores]({{< ref supported-state-stores.md >}}). Only a single state store component can be used as the statestore for all actors.
+To use actors, your state store must support multi-item transactions. This means your state store [component](https://github.com/dapr/components-contrib/tree/master/state) must implement the `TransactionalStore` interface. The list of components that support transactions/actors can be found here: [supported state stores]({{< ref supported-state-stores.md >}}). Only a single state store component can be used as the statestore for all actors.
 
 ## Actor timers and reminders
 
```
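For context, the "single state store component used as the statestore for all actors" in this hunk is designated in Dapr by flagging one transactional state component with `actorStateStore: "true"`. A minimal sketch, where the component name and Redis address are placeholder assumptions rather than values from the diff:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore          # placeholder name
spec:
  type: state.redis         # any TransactionalStore-capable store works
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379   # placeholder address
  - name: actorStateStore   # marks this component as the actors' state store
    value: "true"
```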
```diff
@@ -139,7 +139,7 @@ You can create a trace context using the recommended OpenCensus SDKs. OpenCensus
 | C# | [Link](https://github.com/census-instrumentation/opencensus-csharp/)
 | C++ | [Link](https://github.com/census-instrumentation/opencensus-cpp)
 | Node.js | [Link](https://github.com/census-instrumentation/opencensus-node)
-| Python | [Link](https://census-instrumentation.github.io/opencensus-python/trace/api/index.html)
+| Python | [Link](https://github.com/census-instrumentation/opencensus-python/tree/master/opencensus/trace)
 
 ### Create trace context in Go
 
```
````diff
@@ -20,24 +20,6 @@ The diagram below is an example of how dead letter topics work. First a message
 
 <img src="/images/pubsub_deadletter.png" width=1200>
 
-## Configuring a dead letter topic with a declarative subscription
-
-The following YAML shows how to configure a subscription with a dead letter topic named `poisonMessages` for messages consumed from the `orders` topic. This subscription is scoped to an app with a `checkout` ID.
-
-```yaml
-apiVersion: dapr.io/v1alpha1
-kind: Subscription
-metadata:
-  name: order
-spec:
-  topic: orders
-  route: /checkout
-  pubsubname: pubsub
-  deadLetterTopic: poisonMessages
-scopes:
-- checkout
-```
-
 ## Configuring a dead letter topic with programmatic subscription
 
 The JSON returned from the `/subscribe` endpoint shows how to configure a dead letter topic named `poisonMessages` for messages consumed from the `orders` topic.
````
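The programmatic section kept by this hunk refers to the subscription list an app returns from its `/subscribe` endpoint. A sketch of that JSON, reusing the topic, route, and dead letter names from the removed declarative example (the exact payload is not shown in this diff, so treat the shape as an illustration of the documented programmatic-subscription format):

```json
[
  {
    "pubsubname": "pubsub",
    "topic": "orders",
    "route": "/checkout",
    "deadLetterTopic": "poisonMessages"
  }
]
```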
```diff
@@ -11,7 +11,7 @@ Let's take a look at Dapr's [Bindings building block]({{< ref bindings >}}). Usi
 - Trigger your app with events coming in from external systems.
 - Interface with external systems.
 
-In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://docs.dapr.io/reference/components-reference/supported-bindings/postgres) Dapr binding.
+In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://v1-8.docs.dapr.io/reference/components-reference/supported-bindings/postgres) Dapr binding.
 
 <img src="/images/bindings-quickstart/bindings-quickstart.png" width=800 style="padding-bottom:15px;">
 
```
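The 10-second schedule this hunk mentions is expressed in the Cron binding component's `schedule` metadata field. A minimal sketch — the component name is a placeholder (Dapr invokes the app on a route matching the binding's name):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: cron              # placeholder; the binding name becomes the app route
spec:
  type: bindings.cron
  version: v1
  metadata:
  - name: schedule
    value: "@every 10s"   # trigger the app every 10 seconds
```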
```diff
@@ -75,4 +75,4 @@ By default, tailing is set to /var/log/containers/*.log. To change this setting,
 * [Telemetry Data Platform](https://newrelic.com/platform/telemetry-data-platform)
 * [New Relic Logging](https://github.com/newrelic/helm-charts/tree/master/charts/newrelic-logging)
 * [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
-* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/learn-alerts/alerts-ai-transition-guide-2022/)
+* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)
```
```diff
@@ -40,4 +40,4 @@ This document explains how to install it in your cluster, either using a Helm ch
 * [Telemetry Data Platform](https://newrelic.com/platform/telemetry-data-platform)
 * [New Relic Prometheus OpenMetrics Integration](https://github.com/newrelic/helm-charts/tree/master/charts/nri-prometheus)
 * [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
-* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/learn-alerts/alerts-ai-transition-guide-2022/)
+* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)
```
```diff
@@ -101,7 +101,7 @@ And the exact same dashboard templates from Dapr can be imported to visualize Da
 
 ## New Relic Alerts
 
-All the data that is collected from Dapr, Kubernetes or any services that run on top of can be used to set-up alerts and notifications into the preferred channel of your choice. See [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/learn-alerts/alerts-ai-transition-guide-2022/).
+All the data that is collected from Dapr, Kubernetes or any services that run on top of can be used to set-up alerts and notifications into the preferred channel of your choice. See [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/).
 
 ## Related Links/References
 
```
```diff
@@ -111,4 +111,4 @@ All the data that is collected from Dapr, Kubernetes or any services that run on
 * [New Relic Trace API](https://docs.newrelic.com/docs/distributed-tracing/trace-api/introduction-trace-api/)
 * [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
 * [New Relic OpenTelemetry User Experience](https://blog.newrelic.com/product-news/opentelemetry-user-experience/)
-* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/learn-alerts/alerts-ai-transition-guide-2022/)
+* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)
```
```diff
@@ -51,7 +51,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
 | clientSecret | Y | Output | The commercetools client secret for the project | `"client secret"` |
 | scopes | Y | Output | The commercetools scopes for the project | `"manage_project:project-key"` |
 
-For more information see [commercetools - Creating an API Client](https://docs.commercetools.com/tutorials/getting-started#creating-an-api-client) and [commercetools - Regions](https://docs.commercetools.com/api/general-concepts#regions).
+For more information see [commercetools - Creating an API Client](https://docs.commercetools.com/getting-started/create-api-client) and [commercetools - Regions](https://docs.commercetools.com/api/general-concepts#regions).
 
 ## Binding support
 
```
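A hedged sketch of how the metadata rows in this table fit into a commercetools binding component. All values are placeholders, and the exact field-name casing is an assumption based on the spec-field table this hunk belongs to, not something shown in the diff:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: commercetools
spec:
  type: bindings.commercetools
  version: v1
  metadata:
  - name: clientSecret
    value: "client secret"                # placeholder; prefer a secret store
  - name: scopes
    value: "manage_project:project-key"   # placeholder project scope
```

As the surrounding text notes, secrets should come from a secret store reference rather than plain strings.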
```diff
@@ -333,7 +333,7 @@ To run without Docker, see the getting started guide [here](https://kafka.apache
 {{% /codetab %}}
 
 {{% codetab %}}
-To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](https://strimzi.io/docs/operators/latest/quickstart.html#ref-install-prerequisites-str).
+To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](https://strimzi.io/docs/operators/latest/overview.html).
 {{% /codetab %}}
 
 {{< /tabs >}}
 
```
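For context, once a Kafka cluster is running (via Docker, a bare install, or an operator such as Strimzi), it is wired into Dapr through a `pubsub.kafka` component. A minimal local-dev sketch — the broker address, consumer group, and component name are placeholder assumptions:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub
spec:
  type: pubsub.kafka
  version: v1
  metadata:
  - name: brokers
    value: "localhost:9092"   # placeholder broker address
  - name: consumerGroup
    value: "group1"           # placeholder consumer group
  - name: authRequired
    value: "false"            # no auth for local dev; enable for production
```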