From 582f8a1a4ff8f4193a80f88dcc6be06301f10a15 Mon Sep 17 00:00:00 2001 From: Chris Gillum Date: Tue, 16 May 2023 16:44:56 -0700 Subject: [PATCH 1/6] [Workflow] Add retry policy docs Signed-off-by: Chris Gillum --- .../workflow/workflow-features-concepts.md | 30 +++++++++++++++++++ 1 file changed, 30 insertions(+) diff --git a/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md b/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md index 1ae7c3dd8..a245c254e 100644 --- a/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md +++ b/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md @@ -105,6 +105,36 @@ Dapr Workflows allow you to schedule reminder-like durable delays for any time r Some APIs in the workflow authoring SDK may internally schedule durable timers to implement internal timeout behavior. {{% /alert %}} +## Retry policies + +Workflows support durable retry policies for activities and child workflows. Workflow retry policies are separate and distinct from [Dapr resiliency policies]({{< ref "resiliency-overview.md" >}}) in the following ways. + +- Workflow retry policies are configured by the workflow author in code, whereas Dapr Resiliency policies are configured by the application operator in YAML. +- Workflow retry policies are durable and maintain their state across application restarts, whereas Dapr Resiliency policies are not durable and must be re-applied after application restarts. +- Workflow retry policies are triggered by unhandled errors/exceptions in activities and child workflows, whereas Dapr Resiliency policies are triggered by operation timeouts and connectivity faults. + +Retries are internally implemented using durable timers. This means that workflows can be safely unloaded from memory while waiting for a retry to fire, conserving system resources. This also means that delays between retries can be arbitrarily long, including minutes, hours, or even days. + +{{% alert title="Note" color="primary" %}} +The actions performed by a retry policy are saved into a workflow's history. Care must be taken not to change the behavior of a retry policy after a workflow has already been executed. Otherwise, the workflow may behave unexpectedly when replayed. See the notes on [updating workflow code]({{< ref "#updating-workflow-code" >}}) for more information. +{{% /alert %}} + +It's possible to use both workflow retry policies and Dapr Resiliency policies together. For example, if a workflow activity uses a Dapr Client to invoke a service, the Dapr Client will use the configured resiliency policy, if any. However, if the activity itself fails for any reason, including exhausting the retries on the resiliency policy, then the workflow's resiliency policy kicks in. + +{{% alert title="Note" color="primary" %}} +Using workflow retry policies and resiliency policies together can result in unexpected behavior. For example, if a workflow activity exhausts its configured retry policy, the workflow engine will still retry the activity according to the workflow retry policy. This can result in the activity being retried more times than expected. +{{% /alert %}} + +Because workflow retry policies are configured in code, the exact developer experience may vary depending on the version of the workflow SDK. In general, workflow retry policies can be configured with the following parameters. 
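For illustration only: in the .NET SDK a retry policy is applied per activity or child workflow call. The sketch below assumes the `Dapr.Workflow` package's `WorkflowTaskOptions` and `WorkflowRetryPolicy` types (the exact names and signatures may differ across SDK versions), would sit inside a workflow's `RunAsync` method, and sets each of the parameters described in the table that follows. The activity name and `order` input are placeholders borrowed from the order-processing example later in this PR.

```csharp
// Hedged sketch — assumes Dapr.Workflow's WorkflowTaskOptions/WorkflowRetryPolicy types.
var retryOptions = new WorkflowTaskOptions
{
    RetryPolicy = new WorkflowRetryPolicy(
        maxNumberOfAttempts: 3,                      // Maximum number of attempts
        firstRetryInterval: TimeSpan.FromSeconds(5), // First retry interval
        backoffCoefficient: 2.0,                     // Backoff coefficient (multiplier applied to the delay)
        maxRetryInterval: TimeSpan.FromMinutes(10),  // Maximum retry interval
        retryTimeout: TimeSpan.FromHours(1)),        // Retry timeout
};

// The policy applies to this activity call and is replayed durably from the workflow history.
await context.CallActivityAsync(nameof(ProcessPaymentActivity), order, retryOptions);
```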
+ +| Parameter | Description | +| --- | --- | +| **Maximum number of attempts** | The maximum number of times to execute the activity or child workflow. | +| **First retry interval** | The amount of time to wait before the first retry. | +| **Backoff coefficient** | The amount of time to wait before each subsequent retry. | +| **Maximum retry interval** | The maximum amount of time to wait before each subsequent retry. | +| **Retry timeout** | The overall timeout for retries, regardless of any configured max number of attempts. | + ## External events Sometimes workflows will need to wait for events that are raised by external systems. For example, an approval workflow may require a human to explicitly approve an order request within an order processing workflow if the total cost exceeds some threshold. Another example is a trivia game orchestration workflow that pauses while waiting for all participants to submit their answers to trivia questions. These mid-execution inputs are referred to as _external events_. From 08f5b29b5a524434e626eef732ebaf777c7ad6a1 Mon Sep 17 00:00:00 2001 From: Chris Gillum Date: Tue, 16 May 2023 21:23:11 -0700 Subject: [PATCH 2/6] Apply msfussell suggestion on resiliency policy reference Co-authored-by: Mark Fussell Signed-off-by: Chris Gillum --- .../building-blocks/workflow/workflow-features-concepts.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md b/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md index a245c254e..456499676 100644 --- a/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md +++ b/daprdocs/content/en/developing-applications/building-blocks/workflow/workflow-features-concepts.md @@ -119,7 +119,7 @@ Retries are internally implemented using durable timers. This means that workflo The actions performed by a retry policy are saved into a workflow's history. Care must be taken not to change the behavior of a retry policy after a workflow has already been executed. Otherwise, the workflow may behave unexpectedly when replayed. See the notes on [updating workflow code]({{< ref "#updating-workflow-code" >}}) for more information. {{% /alert %}} -It's possible to use both workflow retry policies and Dapr Resiliency policies together. For example, if a workflow activity uses a Dapr Client to invoke a service, the Dapr Client will use the configured resiliency policy, if any. However, if the activity itself fails for any reason, including exhausting the retries on the resiliency policy, then the workflow's resiliency policy kicks in. +It's possible to use both workflow retry policies and Dapr Resiliency policies together. For example, if a workflow activity uses a Dapr client to invoke a service, the Dapr client uses the configured resiliency policy. See [Quickstart: Service-to-service resiliency]({{< ref "#resiliency-serviceinvo-quickstart" >}}) for more information with an example. However, if the activity itself fails for any reason, including exhausting the retries on the resiliency policy, then the workflow's resiliency policy kicks in. {{% alert title="Note" color="primary" %}} Using workflow retry policies and resiliency policies together can result in unexpected behavior. For example, if a workflow activity exhausts its configured retry policy, the workflow engine will still retry the activity according to the workflow retry policy. 
This can result in the activity being retried more times than expected. From 2ddf28688ca66d13fb138cfed93bbf297458de77 Mon Sep 17 00:00:00 2001 From: ItalyPaleAle <43508+ItalyPaleAle@users.noreply.github.com> Date: Wed, 17 May 2023 13:55:22 -0700 Subject: [PATCH 3/6] Standardize PostgreSQL components' name Fixes #3200 Signed-off-by: ItalyPaleAle <43508+ItalyPaleAle@users.noreply.github.com> --- daprdocs/content/en/concepts/overview.md | 2 +- .../quickstarts/bindings-quickstart.md | 74 +++++++++---------- .../{postgres.md => postgresql.md} | 10 +-- .../postgres-configuration-store.md | 21 +++--- .../setup-postgresql.md | 2 +- .../data/components/bindings/generic.yaml | 4 +- .../configuration_stores/generic.yaml | 4 +- 7 files changed, 59 insertions(+), 58 deletions(-) rename daprdocs/content/en/reference/components-reference/supported-bindings/{postgres.md => postgresql.md} (88%) diff --git a/daprdocs/content/en/concepts/overview.md b/daprdocs/content/en/concepts/overview.md index 1d19e3aad..4a6263e7f 100644 --- a/daprdocs/content/en/concepts/overview.md +++ b/daprdocs/content/en/concepts/overview.md @@ -36,7 +36,7 @@ Each of these building block APIs is independent, meaning that you can use one, | Building Block | Description | |----------------|-------------| | [**Service-to-service invocation**]({{< ref "service-invocation-overview.md" >}}) | Resilient service-to-service invocation enables method calls, including retries, on remote services, wherever they are located in the supported hosting environment. -| [**State management**]({{< ref "state-management-overview.md" >}}) | With state management for storing and querying key/value pairs, long-running, highly available, stateful services can be easily written alongside stateless services in your application. The state store is pluggable and examples include AWS DynamoDB, Azure CosmosDB, Azure SQL Server, GCP Firebase, PostgreSQL or Redis, among others. +| [**State management**]({{< ref "state-management-overview.md" >}}) | With state management for storing and querying key/value pairs, long-running, highly available, stateful services can be easily written alongside stateless services in your application. The state store is pluggable and examples include AWS DynamoDB, Azure Cosmos DB, Azure SQL Server, GCP Firebase, PostgreSQL or Redis, among others. | [**Publish and subscribe**]({{< ref "pubsub-overview.md" >}}) | Publishing events and subscribing to topics between services enables event-driven architectures to simplify horizontal scalability and make them resilient to failure. Dapr provides at-least-once message delivery guarantee, message TTL, consumer groups and other advance features. | [**Resource bindings**]({{< ref "bindings-overview.md" >}}) | Resource bindings with triggers builds further on event-driven architectures for scale and resiliency by receiving and sending events to and from any external source such as databases, queues, file systems, etc. | [**Actors**]({{< ref "actors-overview.md" >}}) | A pattern for stateful and stateless objects that makes concurrency simple, with method and state encapsulation. Dapr provides many capabilities in its actor runtime, including concurrency, state, and life-cycle management for actor activation/deactivation, and timers and reminders to wake up actors. 
diff --git a/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md b/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md index fd8cb0d37..377cf7d61 100644 --- a/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md +++ b/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md @@ -11,7 +11,7 @@ Let's take a look at Dapr's [Bindings building block]({{< ref bindings >}}). Usi - Trigger your app with events coming in from external systems. - Interface with external systems. -In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://docs.dapr.io/reference/components-reference/supported-bindings/postgres) Dapr binding. +In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://docs.dapr.io/reference/components-reference/supported-bindings/postgresql) Dapr binding. @@ -98,7 +98,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin def process_batch(): ``` -The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. +The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. ```python with DaprClient() as d: @@ -140,7 +140,7 @@ In a new terminal, verify the same data has been inserted into the database. Nav cd bindings/db ``` -Run the following to start the interactive Postgres CLI: +Run the following to start the interactive *psql* CLI: ```bash docker exec -i -t postgres psql --username postgres -p 5432 -h localhost --no-password @@ -193,16 +193,16 @@ spec: **Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked. -#### `component\binding-postgres.yaml` component file +#### `component\binding-postgresql.yaml` component file When you execute the `dapr run` command and specify the component path, the Dapr sidecar: -- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}}) -- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file +- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}}) +- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file -With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. +With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. 
-The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following: +The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following: ```yaml apiVersion: dapr.io/v1alpha1 @@ -211,7 +211,7 @@ metadata: name: sqldb namespace: quickstarts spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required @@ -304,7 +304,7 @@ async function start() { } ``` -The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "##componentsbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. +The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "##componentsbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. ```javascript async function processBatch(){ @@ -395,16 +395,16 @@ spec: **Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked. -#### `component\binding-postgres.yaml` component file +#### `component\binding-postgresql.yaml` component file When you execute the `dapr run` command and specify the component path, the Dapr sidecar: -- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}}) -- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file +- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}}) +- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file -With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. +With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. -The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following: +The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following: ```yaml apiVersion: dapr.io/v1alpha1 @@ -413,7 +413,7 @@ metadata: name: sqldb namespace: quickstarts spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required @@ -506,7 +506,7 @@ app.MapPost("/" + cronBindingName, async () => { }); ``` -The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. +The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. ```csharp // ... @@ -599,16 +599,16 @@ spec: **Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked. 
-#### `component\binding-postgres.yaml` component file +#### `component\binding-postgresql.yaml` component file When you execute the `dapr run` command and specify the component path, the Dapr sidecar: -- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}}) -- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file +- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}}) +- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file -With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. +With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. -The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following: +The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following: ```yaml apiVersion: dapr.io/v1alpha1 @@ -617,7 +617,7 @@ metadata: name: sqldb namespace: quickstarts spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required @@ -711,7 +711,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin public ResponseEntity processBatch() throws IOException, Exception ``` -The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. +The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. ```java try (DaprClient client = new DaprClientBuilder().build()) { @@ -809,16 +809,16 @@ spec: **Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked. -#### `component\binding-postgres.yaml` component file +#### `component\binding-postgresql.yaml` component file When you execute the `dapr run` command and specify the component path, the Dapr sidecar: -- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}}) -- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file +- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}}) +- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file -With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. +With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. 
-The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following: +The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following: ```yaml apiVersion: dapr.io/v1alpha1 @@ -827,7 +827,7 @@ metadata: name: sqldb namespace: quickstarts spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required @@ -918,7 +918,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin r.HandleFunc("/"+cronBindingName, processBatch).Methods("POST") ``` -The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. +The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table. ```go func sqlOutput(order Order) (err error) { @@ -1021,16 +1021,16 @@ spec: **Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked. -#### `component\binding-postgres.yaml` component file +#### `component\binding-postgresql.yaml` component file When you execute the `dapr run` command and specify the component path, the Dapr sidecar: -- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}}) -- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file +- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}}) +- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file -With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. +With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes. 
-The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following: +The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following: ```yaml apiVersion: dapr.io/v1alpha1 @@ -1039,7 +1039,7 @@ metadata: name: sqldb namespace: quickstarts spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required diff --git a/daprdocs/content/en/reference/components-reference/supported-bindings/postgres.md b/daprdocs/content/en/reference/components-reference/supported-bindings/postgresql.md similarity index 88% rename from daprdocs/content/en/reference/components-reference/supported-bindings/postgres.md rename to daprdocs/content/en/reference/components-reference/supported-bindings/postgresql.md index 2a423dfa7..fdefee5cc 100644 --- a/daprdocs/content/en/reference/components-reference/supported-bindings/postgres.md +++ b/daprdocs/content/en/reference/components-reference/supported-bindings/postgresql.md @@ -4,12 +4,13 @@ title: "PostgreSQL binding spec" linkTitle: "PostgreSQL" description: "Detailed documentation on the PostgreSQL binding component" aliases: + - "/operations/components/setup-bindings/supported-bindings/postgresql/" - "/operations/components/setup-bindings/supported-bindings/postgres/" --- ## Component format -To setup PostgreSQL binding create a component of type `bindings.postgres`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration. +To setup PostgreSQL binding create a component of type `bindings.postgresql`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration. ```yaml @@ -18,7 +19,7 @@ kind: Component metadata: name: spec: - type: bindings.postgres + type: bindings.postgresql version: v1 metadata: - name: url # Required @@ -33,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr | Field | Required | Binding support | Details | Example | |--------------------|:--------:|------------|-----|---------| -| url | Y | Output | Postgres connection string See [here](#url-format) for more details | `"user=dapr password=secret host=dapr.example.com port=5432 dbname=dapr sslmode=verify-ca"` | +| url | Y | Output | PostgreSQL connection string See [here](#url-format) for more details | `"user=dapr password=secret host=dapr.example.com port=5432 dbname=dapr sslmode=verify-ca"` | ### URL format @@ -144,8 +145,7 @@ Finally, the `close` operation can be used to explicitly close the DB connection } ``` - -> Note, the PostgreSql binding itself doesn't prevent SQL injection, like with any database application, validate the input before executing query. +> Note, the PostgreSQL binding itself doesn't prevent SQL injection, like with any database application, validate the input before executing query. 
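To make that advice concrete, here is a hedged sketch (not part of this patch) of validating untrusted input before it is interpolated into the `sql` metadata of an `exec` operation. It uses the .NET SDK's `DaprClient.InvokeBindingAsync`; the binding name `sqldb` mirrors the examples above, and the table and column names are assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Dapr.Client;

public static class OrderWriter
{
    // Reject anything that isn't a plain positive integer before it reaches the SQL text,
    // because the binding forwards the "sql" metadata value to PostgreSQL as-is.
    public static async Task InsertOrderAsync(DaprClient client, string rawOrderId)
    {
        if (!int.TryParse(rawOrderId, out var orderId) || orderId <= 0)
        {
            throw new ArgumentException("Order ID must be a positive integer.", nameof(rawOrderId));
        }

        var metadata = new Dictionary<string, string>
        {
            ["sql"] = $"INSERT INTO orders (orderid, customer, price) VALUES ({orderId}, 'Unknown', 0);"
        };

        // Invoke the output binding's exec operation with the validated SQL text.
        await client.InvokeBindingAsync("sqldb", "exec", new { }, metadata);
    }
}
```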
## Related links diff --git a/daprdocs/content/en/reference/components-reference/supported-configuration-stores/postgres-configuration-store.md b/daprdocs/content/en/reference/components-reference/supported-configuration-stores/postgres-configuration-store.md index 35c74f030..4a6991748 100644 --- a/daprdocs/content/en/reference/components-reference/supported-configuration-stores/postgres-configuration-store.md +++ b/daprdocs/content/en/reference/components-reference/supported-configuration-stores/postgres-configuration-store.md @@ -1,15 +1,16 @@ --- type: docs -title: "Postgres" -linkTitle: "Postgres" -description: Detailed information on the Postgres configuration store component +title: "PostgreSQL" +linkTitle: "PostgreSQL" +description: Detailed information on the PostgreSQL configuration store component aliases: + - "/operations/components/setup-configuration-store/supported-configuration-stores/setup-postgresql/" - "/operations/components/setup-configuration-store/supported-configuration-stores/setup-postgres/" --- ## Component format -To set up an Postgres configuration store, create a component of type `configuration.postgres` +To set up an PostgreSQL configuration store, create a component of type `configuration.postgresql` ```yaml apiVersion: dapr.io/v1alpha1 @@ -17,7 +18,7 @@ kind: Component metadata: name: spec: - type: configuration.postgres + type: configuration.postgresql version: v1 metadata: - name: connectionString @@ -40,10 +41,10 @@ The above example uses secrets as plain strings. It is recommended to use a secr | connectionString | Y | The connection string for PostgreSQL. Default pool_max_conns = 5 | `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test pool_max_conns=10"` | table | Y | table name for configuration information. | `configTable` -## Set up Postgres as Configuration Store +## Set up PostgreSQL as Configuration Store -1. Start Postgres Database -1. Connect to the Postgres database and setup a configuration table with following schema - +1. Start PostgreSQL Database +1. Connect to the PostgreSQL database and setup a configuration table with following schema - | Field | Datatype | Nullable |Details | |--------------------|:--------:|---------|---------| @@ -101,13 +102,13 @@ AFTER INSERT OR UPDATE OR DELETE ON configTable 7. In the subscribe request add an additional metadata field with key as `pgNotifyChannel` and value should be set to same `channel name` mentioned in `pg_notify`. From the above example, it should be set to `config` {{% alert title="Note" color="primary" %}} -When calling `subscribe` API, `metadata.pgNotifyChannel` should be used to specify the name of the channel to listen for notifications from Postgres configuration store. +When calling `subscribe` API, `metadata.pgNotifyChannel` should be used to specify the name of the channel to listen for notifications from PostgreSQL configuration store. Any number of keys can be added to a subscription request. Each subscription uses an exclusive database connection. It is strongly recommended to subscribe to multiple keys within a single subscription. This helps optimize the number of connections to the database. 
Example of subscribe HTTP API - ```ps -curl --location --request GET 'http://:/configuration/postgres/subscribe?key=&key=&metadata.pgNotifyChannel=' +curl --location --request GET 'http://:/configuration/mypostgresql/subscribe?key=&key=&metadata.pgNotifyChannel=' ``` {{% /alert %}} diff --git a/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-postgresql.md b/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-postgresql.md index 263381df3..d8472e840 100644 --- a/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-postgresql.md +++ b/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-postgresql.md @@ -11,7 +11,7 @@ This component allows using PostgreSQL (Postgres) as state store for Dapr. ## Create a Dapr component -Create a file called `postgres.yaml`, paste the following and replace the `` value with your connection string. The connection string is a standard PostgreSQL connection string. For example, `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test"`. See the PostgreSQL [documentation on database connections](https://www.postgresql.org/docs/current/libpq-connect.html) for information on how to define a connection string. +Create a file called `postgresql.yaml`, paste the following and replace the `` value with your connection string. The connection string is a standard PostgreSQL connection string. For example, `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test"`. See the PostgreSQL [documentation on database connections](https://www.postgresql.org/docs/current/libpq-connect.html) for information on how to define a connection string. If you want to also configure PostgreSQL to store actors, add the `actorStateStore` option as in the example below. 
diff --git a/daprdocs/data/components/bindings/generic.yaml b/daprdocs/data/components/bindings/generic.yaml index df07ddd27..f6ce33e6e 100644 --- a/daprdocs/data/components/bindings/generic.yaml +++ b/daprdocs/data/components/bindings/generic.yaml @@ -70,8 +70,8 @@ features: input: false output: true -- component: PostgreSql - link: postgres +- component: PostgreSQL + link: postgresql state: Stable version: v1 since: "1.9" diff --git a/daprdocs/data/components/configuration_stores/generic.yaml b/daprdocs/data/components/configuration_stores/generic.yaml index 6377c08cd..e51482aa3 100644 --- a/daprdocs/data/components/configuration_stores/generic.yaml +++ b/daprdocs/data/components/configuration_stores/generic.yaml @@ -3,8 +3,8 @@ state: Stable version: v1 since: "1.11" -- component: Postgres - link: postgres-configuration-store +- component: PostgreSQL + link: postgresql-configuration-store state: Stable version: v1 since: "1.11" From f975deefbb5c82e52ac6e1e4c2ec3544cd9b3de8 Mon Sep 17 00:00:00 2001 From: "Alessandro (Ale) Segala" <43508+ItalyPaleAle@users.noreply.github.com> Date: Wed, 17 May 2023 15:57:16 -0700 Subject: [PATCH 4/6] Update daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Signed-off-by: Alessandro (Ale) Segala <43508+ItalyPaleAle@users.noreply.github.com> --- .../en/getting-started/quickstarts/bindings-quickstart.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md b/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md index 377cf7d61..bac891b5a 100644 --- a/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md +++ b/daprdocs/content/en/getting-started/quickstarts/bindings-quickstart.md @@ -11,7 +11,7 @@ Let's take a look at Dapr's [Bindings building block]({{< ref bindings >}}). Usi - Trigger your app with events coming in from external systems. - Interface with external systems. -In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://docs.dapr.io/reference/components-reference/supported-bindings/postgresql) Dapr binding. +In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron]({{< ref cron.md >}}) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL]({{< ref postgresql.md >}}) Dapr binding. 
From 0128abb13a0ce52131b68b84e31e11a2290f92a0 Mon Sep 17 00:00:00 2001 From: Josh van Leeuwen Date: Thu, 18 May 2023 00:12:13 +0100 Subject: [PATCH 5/6] Adds docs about using `ttlInSeconds` to actor state store reference API (#3392) * Adds note about using `ttlInSeconds` to actor state store reference API Signed-off-by: joshvanl * Update daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Signed-off-by: Josh van Leeuwen * Update daprdocs/content/en/reference/api/actors_api.md Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Signed-off-by: Josh van Leeuwen * Update daprdocs/content/en/reference/api/actors_api.md Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Signed-off-by: Josh van Leeuwen * Embed the actor state TTL YouTube video in page Signed-off-by: joshvanl * Update daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md Co-authored-by: Mark Fussell Signed-off-by: Josh van Leeuwen * Adds Actor State section to actor overview page Signed-off-by: joshvanl --------- Signed-off-by: joshvanl Signed-off-by: Josh van Leeuwen Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Co-authored-by: Mark Fussell --- .../building-blocks/actors/actors-overview.md | 9 ++++++++- .../state-management/state-management-overview.md | 5 ++++- daprdocs/content/en/reference/api/actors_api.md | 13 ++++++++++++- 3 files changed, 24 insertions(+), 3 deletions(-) diff --git a/daprdocs/content/en/developing-applications/building-blocks/actors/actors-overview.md b/daprdocs/content/en/developing-applications/building-blocks/actors/actors-overview.md index bda74c2a5..eed874969 100644 --- a/daprdocs/content/en/developing-applications/building-blocks/actors/actors-overview.md +++ b/daprdocs/content/en/developing-applications/building-blocks/actors/actors-overview.md @@ -86,6 +86,13 @@ The Dapr actor runtime provides a simple turn-based access model for accessing a - [Learn more about actor reentrancy]({{< ref "actor-reentrancy.md" >}}) - [Learn more about the turn-based access model]({{< ref "actors-features-concepts.md#turn-based-access" >}}) +### State + +Transactional state stores can be used to store actor state. To specify which state store to use for actors, specify value of property `actorStateStore` as `true` in the state store component's metadata section. Actors state is stored with a specific scheme in transactional state stores, allowing for consistent querying. Only a single state store component can be used as the state store for all actors. Read the [state API reference]({{< ref state_api.md >}}) and the [actors API reference]({{< ref actors_api.md >}}) to learn more about state stores for actors. + +#### Time to Live (TTL) on state +You should always set the TTL metadata field (`ttlInSeconds`), or the equivalent API call in your chosen SDK when saving actor state to ensure that state eventually removed. Read [actors overview]({{< ref actors-overview.md >}}) for more information. + ### Actor timers and reminders Actors can schedule periodic work on themselves by registering either timers or reminders. 
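As a brief aside (not part of this patch), the sketch below shows what registering a reminder looks like with the `Dapr.Actors` .NET SDK; the actor type, reminder name, and intervals are illustrative placeholders.

```csharp
using System;
using System.Threading.Tasks;
using Dapr.Actors.Runtime;

// Sketch: an actor that schedules periodic, durable work for itself with a reminder.
public class OrderMonitorActor : Actor, IRemindable
{
    public OrderMonitorActor(ActorHost host) : base(host) { }

    public Task StartMonitoringAsync()
    {
        // First fires after 10 seconds, then repeats every minute until unregistered.
        return RegisterReminderAsync("check-order-status", null,
            TimeSpan.FromSeconds(10), TimeSpan.FromMinutes(1));
    }

    // Called by the actor runtime whenever the reminder fires, even after a restart.
    public Task ReceiveReminderAsync(string reminderName, byte[] state, TimeSpan dueTime, TimeSpan period)
    {
        return Task.CompletedTask;
    }
}
```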
@@ -105,4 +112,4 @@ This distinction allows users to trade off between light-weight but stateless ti
 ## Related links
 
 - [Actors API reference]({{< ref actors_api.md >}})
-- Refer to the [Dapr SDK documentation and examples]({{< ref "developing-applications/sdks/#sdk-languages" >}}).
\ No newline at end of file
+- Refer to the [Dapr SDK documentation and examples]({{< ref "developing-applications/sdks/#sdk-languages" >}}).
diff --git a/daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md b/daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md
index f9f5813f0..a7dacc361 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/state-management/state-management-overview.md
@@ -93,6 +93,9 @@ You can group write, update, and delete operations into a request, which are the
 Transactional state stores can be used to store actor state. To specify which state store to use for actors, specify value of property `actorStateStore` as `true` in the state store component's metadata section. Actors state is stored with a specific scheme in transactional state stores, allowing for consistent querying. Only a single state store component can be used as the state store for all actors. Read the [state API reference]({{< ref state_api.md >}}) and the [actors API reference]({{< ref actors_api.md >}}) to learn more about state stores for actors.
 
+#### Time to Live (TTL) on actor state
+You should always set the TTL metadata field (`ttlInSeconds`), or the equivalent API call in your chosen SDK when saving actor state, to ensure that state is eventually removed. Read [actors overview]({{< ref actors-overview.md >}}) for more information.
+
 ### State encryption
 
 Dapr supports automatic client encryption of application state with support for key rotations. This is supported on all Dapr state stores. For more info, read the [How-To: Encrypt application state]({{< ref howto-encrypt-state.md >}}) topic.
 
@@ -178,4 +181,4 @@ Want to skip the quickstarts? Not a problem. You can try out the state managemen
 - [How-To: Build a stateful service]({{< ref howto-stateful-service.md >}})
 - Review the list of [state store components]({{< ref supported-state-stores.md >}})
 - Read the [state management API reference]({{< ref state_api.md >}})
-- Read the [actors API reference]({{< ref actors_api.md >}})
\ No newline at end of file
+- Read the [actors API reference]({{< ref actors_api.md >}})
diff --git a/daprdocs/content/en/reference/api/actors_api.md b/daprdocs/content/en/reference/api/actors_api.md
index 81ee3cf79..edd6a46ff 100644
--- a/daprdocs/content/en/reference/api/actors_api.md
+++ b/daprdocs/content/en/reference/api/actors_api.md
@@ -75,6 +75,14 @@ Persists the change to the state for an actor as a multi-item transaction.
 ***Note that this operation is dependent on using a state store component that supports multi-item transactions.***
 
+When putting state, _always_ set the `ttlInSeconds` field in the
+metadata for each value, unless there is a state clean-up process out of band of
+Dapr. Omitting this field will cause the underlying actor state store to
+grow indefinitely.
+
+See the Dapr Community Call 80 recording for more details on actor state TTL.
+ + #### HTTP Request ``` @@ -109,7 +117,10 @@ curl -X POST http://localhost:3500/v1.0/actors/stormtrooper/50/state \ "operation": "upsert", "request": { "key": "key1", - "value": "myData" + "value": "myData", + "metadata": { + "ttlInSeconds": "3600" + } } }, { From 2840ed771423074246668d89f1687a1cdc60bb57 Mon Sep 17 00:00:00 2001 From: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Date: Wed, 17 May 2023 19:15:52 -0400 Subject: [PATCH 6/6] [Workflow] Add purge/pause/resume/raise event to docs (#3332) * add new methods to api doc Signed-off-by: Hannah Hunter * add methods to how to Signed-off-by: Hannah Hunter * updates per ryan Signed-off-by: Hannah Hunter * add raise event request Signed-off-by: Hannah Hunter * add raise event api to the howto Signed-off-by: Hannah Hunter * updates per Ryan; Signed-off-by: Hannah Hunter * whitespace/updates Signed-off-by: Hannah Hunter * add more description to raise event Signed-off-by: Hannah Hunter * clarify raise event and add to the dotnet sdk example Signed-off-by: Hannah Hunter * edit confusing text for raise event http Signed-off-by: Hannah Hunter * updates per Ryan Signed-off-by: Hannah Hunter * rearrange author workflows doc per Ryan suggestion Signed-off-by: Hannah Hunter * update workflowreference to startworkflowresponse Signed-off-by: Hannah Hunter * fix link Signed-off-by: Hannah Hunter * edit from ryan Signed-off-by: Hannah Hunter * part 1 of mark review Signed-off-by: Hannah Hunter * Update daprdocs/content/en/reference/api/workflow_api.md Co-authored-by: Chris Gillum Signed-off-by: Mark Fussell * updates per mark and chris Signed-off-by: Hannah Hunter * few more edits per mark Signed-off-by: Hannah Hunter --------- Signed-off-by: Hannah Hunter Signed-off-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com> Signed-off-by: Mark Fussell Co-authored-by: Mark Fussell Co-authored-by: Chris Gillum --- .../workflow/howto-author-workflow.md | 83 ++++++++++- .../workflow/howto-manage-workflow.md | 61 ++++++-- .../content/en/reference/api/workflow_api.md | 132 +++++++++++++++++- 3 files changed, 255 insertions(+), 21 deletions(-) diff --git a/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-author-workflow.md b/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-author-workflow.md index b9f7d2607..b6191d23d 100644 --- a/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-author-workflow.md +++ b/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-author-workflow.md @@ -28,19 +28,88 @@ The Dapr sidecar doesn’t load any workflow definitions. Rather, the sidecar si ## Write the workflow activities -Define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like binding to a Dapr client. +[Workflow activities]({{< ref "workflow-features-concepts.md#workflow-activites" >}}) are the basic unit of work in a workflow and are the tasks that get orchestrated in the business process. {{< tabs ".NET" >}} {{% codetab %}} -Continuing the ASP.NET order processing example, the `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types. +Define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like binding to a Dapr client. 
-It also includes a `RunAsync` method that does the heavy lifting of the workflow and calls the workflow activities. The activities called in the example are: +The activities called in the example below are: - `NotifyActivity`: Receive notification of a new order. - `ReserveInventoryActivity`: Check for sufficient inventory to meet the new order. - `ProcessPaymentActivity`: Process payment for the order. Includes `NotifyActivity` to send notification of successful order. +### NotifyActivity + +```csharp +public class NotifyActivity : WorkflowActivity +{ + //... + + public NotifyActivity(ILoggerFactory loggerFactory) + { + this.logger = loggerFactory.CreateLogger(); + } + + //... +} +``` + +[See the full `NotifyActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/NotifyActivity.cs) + +### ReserveInventoryActivity + +```csharp +public class ReserveInventoryActivity : WorkflowActivity +{ + //... + + public ReserveInventoryActivity(ILoggerFactory loggerFactory, DaprClient client) + { + this.logger = loggerFactory.CreateLogger(); + this.client = client; + } + + //... + +} +``` +[See the full `ReserveInventoryActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/ReserveInventoryActivity.cs) + +### ProcessPaymentActivity + +```csharp +public class ProcessPaymentActivity : WorkflowActivity +{ + //... + public ProcessPaymentActivity(ILoggerFactory loggerFactory) + { + this.logger = loggerFactory.CreateLogger(); + } + + //... + +} +``` + +[See the full `ProcessPaymentActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/ProcessPaymentActivity.cs) + +{{% /codetab %}} + +{{< /tabs >}} + +## Write the workflow + +Next, register and call the activites in a workflow. + +{{< tabs ".NET" >}} + +{{% codetab %}} + +The `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types. It also includes a `RunAsync` method that does the heavy lifting of the workflow and calls the workflow activities. + ```csharp class OrderProcessingWorkflow : Workflow { @@ -73,19 +142,21 @@ It also includes a `RunAsync` method that does the heavy lifting of the workflow } ``` +[See the full workflow example in `OrderProcessingWorkflow.cs`.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Workflows/OrderProcessingWorkflow.cs) + {{% /codetab %}} {{< /tabs >}} -## Write the workflow +## Write the application -Compose the workflow activities into a workflow. +Finally, compose the application using the workflow. 
{{< tabs ".NET" >}} {{% codetab %}} -[In the following example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include: +[In the following `Program.cs` example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include: - A NuGet package called `Dapr.Workflow` to receive the .NET SDK capabilities - A builder with an extension method called `AddDaprWorkflow` diff --git a/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-manage-workflow.md b/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-manage-workflow.md index 34aa10603..7b2af68b4 100644 --- a/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-manage-workflow.md +++ b/daprdocs/content/en/developing-applications/building-blocks/workflow/howto-manage-workflow.md @@ -21,16 +21,27 @@ string workflowComponent = "dapr"; string workflowName = "OrderProcessingWorkflow"; OrderPayload input = new OrderPayload("Paperclips", 99.95); Dictionary workflowOptions; // This is an optional parameter -CancellationToken cts = CancellationToken.None; -// Start the workflow. This returns back a "WorkflowReference" which contains the instanceID for the particular workflow instance. -WorkflowReference startResponse = await daprClient.StartWorkflowAsync(orderId, workflowComponent, workflowName, input, workflowOptions, cts); +// Start the workflow. This returns back a "StartWorkflowResponse" which contains the instance ID for the particular workflow instance. +StartWorkflowResponse startResponse = await daprClient.StartWorkflowAsync(orderId, workflowComponent, workflowName, input, workflowOptions); -// Get information on the workflow. This response will contain information such as the status of the workflow, when it started, and more! +// Get information on the workflow. This response contains information such as the status of the workflow, when it started, and more! GetWorkflowResponse getResponse = await daprClient.GetWorkflowAsync(orderId, workflowComponent, workflowName); // Terminate the workflow -await daprClient.TerminateWorkflowAsync(instanceId, workflowComponent); +await daprClient.TerminateWorkflowAsync(orderId, workflowComponent); + +// Raise an event (an incoming purchase order) that your workflow will wait for. This returns the item waiting to be purchased. +await daprClient.RaiseWorkflowEventAsync(orderId, workflowComponent, workflowName, input); + +// Pause +await daprClient.PauseWorkflowAsync(orderId, workflowComponent); + +// Resume +await daprClient.ResumeWorkflowAsync(orderId, workflowComponent); + +// Purge +await daprClient.PurgeWorkflowAsync(orderId, workflowComponent); ``` {{% /codetab %}} @@ -44,7 +55,7 @@ Manage your workflow using HTTP calls. 
The example below plugs in the properties To start your workflow with an ID `12345678`, run: -```bash +```http POST http://localhost:3500/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/start?instanceID=12345678 ``` @@ -54,15 +65,49 @@ Note that workflow instance IDs can only contain alphanumeric characters, unders To terminate your workflow with an ID `12345678`, run: -```bash +```http POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/terminate ``` +### Raise an event + +For workflow components that support subscribing to external events, such as the Dapr Workflow engine, you can use the following "raise event" API to deliver a named event to a specific workflow instance. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows///raiseEvent/ +``` + +> An `eventName` can be any function. + +### Pause or resume a workflow + +To plan for down-time, wait for inputs, and more, you can pause and then resume a workflow. To pause a workflow with an ID `12345678` until triggered to resume, run: + +```http +POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/pause +``` + +To resume a workflow with an ID `12345678`, run: + +```http +POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/resume +``` + +### Purge a workflow + +The purge API can be used to permanently delete workflow metadata from the underlying state store, including any stored inputs, outputs, and workflow history records. This is often useful for implementing data retention policies and for freeing resources. + +Only workflow instances in the COMPLETED, FAILED, or TERMINATED state can be purged. If the workflow is in any other state, calling purge returns an error. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/purge +``` + ### Get information about a workflow To fetch workflow information (outputs and inputs) with an ID `12345678`, run: -```bash +```http GET http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678 ``` diff --git a/daprdocs/content/en/reference/api/workflow_api.md b/daprdocs/content/en/reference/api/workflow_api.md index 442b206ad..4ddf8b720 100644 --- a/daprdocs/content/en/reference/api/workflow_api.md +++ b/daprdocs/content/en/reference/api/workflow_api.md @@ -12,7 +12,7 @@ Dapr provides users with the ability to interact with workflows and comes with a Start a workflow instance with the given name and optionally, an instance ID. -```bash +```http POST http://localhost:3500/v1.0-alpha1/workflows///start[?instanceId=] ``` @@ -22,7 +22,7 @@ Note that workflow instance IDs can only contain alphanumeric characters, unders Parameter | Description --------- | ----------- -`workflowComponentName` | Current default is `dapr` for Dapr Workflows +`workflowComponentName` | Use `dapr` for Dapr Workflows `workflowName` | Identify the workflow type `instanceId` | (Optional) Unique value created for each run of a specific workflow @@ -52,7 +52,7 @@ The API call will provide a response similar to this: Terminate a running workflow instance with the given name and instance ID. 
-```bash +```http POST http://localhost:3500/v1.0-alpha1/workflows//terminate ``` @@ -60,7 +60,7 @@ POST http://localhost:3500/v1.0-alpha1/workflows//terminate Parameter | Description --------- | ----------- -`workflowComponentName` | Current default is `dapr` for Dapr Workflows +`workflowComponentName` | Use `dapr` for Dapr Workflows `instanceId` | Unique value created for each run of a specific workflow ### HTTP response codes @@ -75,11 +75,125 @@ Code | Description This API does not return any content. -### Get workflow request +## Raise Event request + +For workflow components that support subscribing to external events, such as the Dapr Workflow engine, you can use the following "raise event" API to deliver a named event to a specific workflow instance. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows///raiseEvent/ +``` + +{{% alert title="Note" color="primary" %}} + The exact mechanism for subscribing to an event depends on the workflow component that you're using. Dapr Workflow has one way of subscribing to external events but other workflow components might have different ways. + +{{% /alert %}} + +### URL parameters + +Parameter | Description +--------- | ----------- +`workflowComponentName` | Use `dapr` for Dapr Workflows +`instanceId` | Unique value created for each run of a specific workflow +`eventName` | The name of the event to raise + +### HTTP response codes + +Code | Description +---- | ----------- +`202` | Accepted +`400` | Request was malformed +`500` | Request formatted correctly, error in dapr code or underlying component + +### Response content + +None. + +## Pause workflow request + +Pause a running workflow instance. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows///pause +``` + +### URL parameters + +Parameter | Description +--------- | ----------- +`workflowComponentName` | Use `dapr` for Dapr Workflows +`instanceId` | Unique value created for each run of a specific workflow + +### HTTP response codes + +Code | Description +---- | ----------- +`202` | Accepted +`400` | Request was malformed +`500` | Error in Dapr code or underlying component + +### Response content + +None. + +## Resume workflow request + +Resume a paused workflow instance. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows///resume +``` + +### URL parameters + +Parameter | Description +--------- | ----------- +`workflowComponentName` | Use `dapr` for Dapr Workflows +`instanceId` | Unique value created for each run of a specific workflow + +### HTTP response codes + +Code | Description +---- | ----------- +`202` | Accepted +`400` | Request was malformed +`500` | Error in Dapr code or underlying component + +### Response content + +None. + +## Purge workflow request + +Purge the workflow state from your state store with the workflow's instance ID. + +```http +POST http://localhost:3500/v1.0-alpha1/workflows///purge +``` + +### URL parameters + +Parameter | Description +--------- | ----------- +`workflowComponentName` | Use `dapr` for Dapr Workflows +`instanceId` | Unique value created for each run of a specific workflow + +### HTTP response codes + +Code | Description +---- | ----------- +`202` | Accepted +`400` | Request was malformed +`500` | Error in Dapr code or underlying component + +### Response content + +None. + +## Get workflow request Get information about a given workflow instance. -```bash +```http GET http://localhost:3500/v1.0-alpha1/workflows// ``` @@ -87,7 +201,7 @@ GET http://localhost:3500/v1.0-alpha1/workflows//