Merge branch 'v1.11' into fix-3329

This commit is contained in:
Hannah Hunter 2023-05-17 19:35:30 -04:00 committed by GitHub
commit b1bf246149
GPG Key ID: 4AEE18F83AFDEB23
14 changed files with 368 additions and 82 deletions

View File

@ -36,7 +36,7 @@ Each of these building block APIs is independent, meaning that you can use one,
| Building Block | Description |
|----------------|-------------|
| [**Service-to-service invocation**]({{< ref "service-invocation-overview.md" >}}) | Resilient service-to-service invocation enables method calls, including retries, on remote services, wherever they are located in the supported hosting environment.
| [**State management**]({{< ref "state-management-overview.md" >}}) | With state management for storing and querying key/value pairs, long-running, highly available, stateful services can be easily written alongside stateless services in your application. The state store is pluggable and examples include AWS DynamoDB, Azure CosmosDB, Azure SQL Server, GCP Firebase, PostgreSQL or Redis, among others.
| [**State management**]({{< ref "state-management-overview.md" >}}) | With state management for storing and querying key/value pairs, long-running, highly available, stateful services can be easily written alongside stateless services in your application. The state store is pluggable and examples include AWS DynamoDB, Azure Cosmos DB, Azure SQL Server, GCP Firebase, PostgreSQL or Redis, among others.
| [**Publish and subscribe**]({{< ref "pubsub-overview.md" >}}) | Publishing events and subscribing to topics between services enables event-driven architectures to simplify horizontal scalability and make them resilient to failure. Dapr provides an at-least-once message delivery guarantee, message TTL, consumer groups, and other advanced features.
| [**Resource bindings**]({{< ref "bindings-overview.md" >}}) | Resource bindings with triggers build further on event-driven architectures for scale and resiliency by receiving and sending events to and from any external source, such as databases, queues, file systems, etc.
| [**Actors**]({{< ref "actors-overview.md" >}}) | A pattern for stateful and stateless objects that makes concurrency simple, with method and state encapsulation. Dapr provides many capabilities in its actor runtime, including concurrency, state, and life-cycle management for actor activation/deactivation, and timers and reminders to wake up actors.

View File

@ -86,6 +86,13 @@ The Dapr actor runtime provides a simple turn-based access model for accessing a
- [Learn more about actor reentrancy]({{< ref "actor-reentrancy.md" >}})
- [Learn more about the turn-based access model]({{< ref "actors-features-concepts.md#turn-based-access" >}})
### State
Transactional state stores can be used to store actor state. To specify which state store to use for actors, set the value of the `actorStateStore` property to `true` in the state store component's metadata section. Actor state is stored with a specific scheme in transactional state stores, allowing for consistent querying. Only a single state store component can be used as the state store for all actors. Read the [state API reference]({{< ref state_api.md >}}) and the [actors API reference]({{< ref actors_api.md >}}) to learn more about state stores for actors.
#### Time to Live (TTL) on state
You should always set the TTL metadata field (`ttlInSeconds`), or use the equivalent API call in your chosen SDK, when saving actor state to ensure that state is eventually removed. Read the [actors overview]({{< ref actors-overview.md >}}) for more information.
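As an illustrative sketch (the `ttl` parameter shown here is an assumption; the exact overload varies by SDK version), saving actor state with a TTL from a .NET actor could look like this:
```csharp
// Inside an actor method: persist state with a TTL so the store
// eventually removes it. The ttl parameter is assumed; check your
// SDK version for the exact signature.
await this.StateManager.SetStateAsync(
    "order",                           // state name
    order,                             // value to persist
    ttl: TimeSpan.FromSeconds(3600));  // remove after one hour
```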
### Actor timers and reminders
Actors can schedule periodic work on themselves by registering either timers or reminders.
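As a minimal .NET sketch (actor and method names here are illustrative), a reminder registration looks roughly like this:
```csharp
using System;
using System.Threading.Tasks;
using Dapr.Actors.Runtime;

public class OrderActor : Actor, IRemindable
{
    public OrderActor(ActorHost host) : base(host) { }

    public Task StartCheckAsync() =>
        // Fire first after 10 seconds, then repeat every minute.
        RegisterReminderAsync("check-order", null, TimeSpan.FromSeconds(10), TimeSpan.FromMinutes(1));

    // Invoked by the actor runtime each time the reminder fires.
    public Task ReceiveReminderAsync(string reminderName, byte[] state, TimeSpan dueTime, TimeSpan period)
        => Task.CompletedTask;
}
```
Unlike a timer, the reminder persists and continues firing even if the actor is deactivated.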
@ -105,4 +112,4 @@ This distinction allows users to trade off between light-weight but stateless ti
## Related links
- [Actors API reference]({{< ref actors_api.md >}})
- Refer to the [Dapr SDK documentation and examples]({{< ref "developing-applications/sdks/#sdk-languages" >}}).
- Refer to the [Dapr SDK documentation and examples]({{< ref "developing-applications/sdks/#sdk-languages" >}}).

View File

@ -93,6 +93,9 @@ You can group write, update, and delete operations into a request, which are the
Transactional state stores can be used to store actor state. To specify which state store to use for actors, set the value of the `actorStateStore` property to `true` in the state store component's metadata section. Actor state is stored with a specific scheme in transactional state stores, allowing for consistent querying. Only a single state store component can be used as the state store for all actors. Read the [state API reference]({{< ref state_api.md >}}) and the [actors API reference]({{< ref actors_api.md >}}) to learn more about state stores for actors.
#### Time to Live (TTL) on actor state
You should always set the TTL metadata field (`ttlInSeconds`), or use the equivalent API call in your chosen SDK, when saving actor state to ensure that state is eventually removed. Read the [actors overview]({{< ref actors-overview.md >}}) for more information.
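The same `ttlInSeconds` metadata field drives TTL for state saved directly through the state API. A minimal sketch using the .NET `DaprClient` (the store name and key are illustrative):
```csharp
using System.Collections.Generic;
using Dapr.Client;

var client = new DaprClientBuilder().Build();

// Ask the state store to remove this value automatically after one hour.
await client.SaveStateAsync(
    storeName: "statestore",
    key: "order_1",
    value: new { Item = "Paperclips" },
    metadata: new Dictionary<string, string> { { "ttlInSeconds", "3600" } });
```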
### State encryption
Dapr supports automatic client encryption of application state with support for key rotations. This is supported on all Dapr state stores. For more info, read the [How-To: Encrypt application state]({{< ref howto-encrypt-state.md >}}) topic.
@ -178,4 +181,4 @@ Want to skip the quickstarts? Not a problem. You can try out the state managemen
- [How-To: Build a stateful service]({{< ref howto-stateful-service.md >}})
- Review the list of [state store components]({{< ref supported-state-stores.md >}})
- Read the [state management API reference]({{< ref state_api.md >}})
- Read the [actors API reference]({{< ref actors_api.md >}})
- Read the [actors API reference]({{< ref actors_api.md >}})

View File

@ -28,19 +28,88 @@ The Dapr sidecar doesn't load any workflow definitions. Rather, the sidecar si
## Write the workflow activities
Define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like binding to a Dapr client.
[Workflow activities]({{< ref "workflow-features-concepts.md#workflow-activites" >}}) are the basic unit of work in a workflow and are the tasks that get orchestrated in the business process.
{{< tabs ".NET" >}}
{{% codetab %}}
Continuing the ASP.NET order processing example, the `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types.
Define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like binding to a Dapr client.
It also includes a `RunAsync` method that does the heavy lifting of the workflow and calls the workflow activities. The activities called in the example are:
The activities called in the example below are:
- `NotifyActivity`: Receive notification of a new order.
- `ReserveInventoryActivity`: Check for sufficient inventory to meet the new order.
- `ProcessPaymentActivity`: Process payment for the order. Includes `NotifyActivity` to send notification of successful order.
### NotifyActivity
```csharp
public class NotifyActivity : WorkflowActivity<Notification, object>
{
    //...
    public NotifyActivity(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger<NotifyActivity>();
    }
    //...
}
```
[See the full `NotifyActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/NotifyActivity.cs)
### ReserveInventoryActivity
```csharp
public class ReserveInventoryActivity : WorkflowActivity<InventoryRequest, InventoryResult>
{
    //...
    public ReserveInventoryActivity(ILoggerFactory loggerFactory, DaprClient client)
    {
        this.logger = loggerFactory.CreateLogger<ReserveInventoryActivity>();
        this.client = client;
    }
    //...
}
```
[See the full `ReserveInventoryActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/ReserveInventoryActivity.cs)
### ProcessPaymentActivity
```csharp
public class ProcessPaymentActivity : WorkflowActivity<PaymentRequest, object>
{
    //...
    public ProcessPaymentActivity(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger<ProcessPaymentActivity>();
    }
    //...
}
```
[See the full `ProcessPaymentActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/ProcessPaymentActivity.cs)
{{% /codetab %}}
{{< /tabs >}}
## Write the workflow
Next, register and call the activities in a workflow.
{{< tabs ".NET" >}}
{{% codetab %}}
The `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types. It also includes a `RunAsync` method that does the heavy lifting of the workflow and calls the workflow activities.
```csharp
class OrderProcessingWorkflow : Workflow<OrderPayload, OrderResult>
{
@ -73,19 +142,21 @@ It also includes a `RunAsync` method that does the heavy lifting of the workflow
}
```
[See the full workflow example in `OrderProcessingWorkflow.cs`.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Workflows/OrderProcessingWorkflow.cs)
{{% /codetab %}}
{{< /tabs >}}
## Write the workflow
## Write the application
Compose the workflow activities into a workflow.
Finally, compose the application using the workflow.
{{< tabs ".NET" >}}
{{% codetab %}}
[In the following example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include:
[In the following `Program.cs` example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include:
- A NuGet package called `Dapr.Workflow` to receive the .NET SDK capabilities
- A builder with an extension method called `AddDaprWorkflow`

View File

@ -21,16 +21,27 @@ string workflowComponent = "dapr";
string workflowName = "OrderProcessingWorkflow";
OrderPayload input = new OrderPayload("Paperclips", 99.95);
Dictionary<string, string> workflowOptions = null; // This is an optional parameter
CancellationToken cts = CancellationToken.None;
// Start the workflow. This returns back a "WorkflowReference" which contains the instanceID for the particular workflow instance.
WorkflowReference startResponse = await daprClient.StartWorkflowAsync(orderId, workflowComponent, workflowName, input, workflowOptions, cts);
// Start the workflow. This returns back a "StartWorkflowResponse" which contains the instance ID for the particular workflow instance.
StartWorkflowResponse startResponse = await daprClient.StartWorkflowAsync(orderId, workflowComponent, workflowName, input, workflowOptions);
// Get information on the workflow. This response will contain information such as the status of the workflow, when it started, and more!
// Get information on the workflow. This response contains information such as the status of the workflow, when it started, and more!
GetWorkflowResponse getResponse = await daprClient.GetWorkflowAsync(orderId, workflowComponent, workflowName);
// Terminate the workflow
await daprClient.TerminateWorkflowAsync(instanceId, workflowComponent);
await daprClient.TerminateWorkflowAsync(orderId, workflowComponent);
// Raise an event (an incoming purchase order) that your workflow will wait for. This returns the item waiting to be purchased.
await daprClient.RaiseWorkflowEventAsync(orderId, workflowComponent, workflowName, input);
// Pause
await daprClient.PauseWorkflowAsync(orderId, workflowComponent);
// Resume
await daprClient.ResumeWorkflowAsync(orderId, workflowComponent);
// Purge
await daprClient.PurgeWorkflowAsync(orderId, workflowComponent);
```
{{% /codetab %}}
@ -44,7 +55,7 @@ Manage your workflow using HTTP calls. The example below plugs in the properties
To start your workflow with an ID `12345678`, run:
```bash
```http
POST http://localhost:3500/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/start?instanceID=12345678
```
@ -54,15 +65,49 @@ Note that workflow instance IDs can only contain alphanumeric characters, unders
To terminate your workflow with an ID `12345678`, run:
```bash
```http
POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/terminate
```
### Raise an event
For workflow components that support subscribing to external events, such as the Dapr Workflow engine, you can use the following "raise event" API to deliver a named event to a specific workflow instance.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceID>/raiseEvent/<eventName>
```
> An `eventName` can be any name you choose.
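The .NET SDK equivalent of this HTTP call is a sketch like the following (the event name and payload are illustrative, not part of the example above):
```csharp
// Deliver the "approval_received" event and its payload to instance 12345678.
await daprClient.RaiseWorkflowEventAsync(
    "12345678",           // workflow instance ID
    "dapr",               // workflow component
    "approval_received",  // event name
    approvalPayload);     // event payload (defined elsewhere)
```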
### Pause or resume a workflow
To plan for down-time, wait for inputs, and more, you can pause and then resume a workflow. To pause a workflow with an ID `12345678` until triggered to resume, run:
```http
POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/pause
```
To resume a workflow with an ID `12345678`, run:
```http
POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/resume
```
### Purge a workflow
The purge API can be used to permanently delete workflow metadata from the underlying state store, including any stored inputs, outputs, and workflow history records. This is often useful for implementing data retention policies and for freeing resources.
Only workflow instances in the COMPLETED, FAILED, or TERMINATED state can be purged. If the workflow is in any other state, calling purge returns an error.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678/purge
```
### Get information about a workflow
To fetch workflow information (outputs and inputs) with an ID `12345678`, run:
```bash
```http
GET http://localhost:3500/v1.0-alpha1/workflows/dapr/12345678
```

View File

@ -105,6 +105,36 @@ Dapr Workflows allow you to schedule reminder-like durable delays for any time r
Some APIs in the workflow authoring SDK may internally schedule durable timers to implement internal timeout behavior.
{{% /alert %}}
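For reference, a durable delay in the .NET authoring SDK is created from the workflow context, as in this sketch (assuming a `CreateTimer` method; check your SDK version for the exact shape):
```csharp
// Inside a workflow's RunAsync: wait 24 hours without pinning the
// workflow in memory. Durable timers survive process restarts.
await context.CreateTimer(TimeSpan.FromHours(24));
```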
## Retry policies
Workflows support durable retry policies for activities and child workflows. Workflow retry policies are separate and distinct from [Dapr resiliency policies]({{< ref "resiliency-overview.md" >}}) in the following ways.
- Workflow retry policies are configured by the workflow author in code, whereas Dapr Resiliency policies are configured by the application operator in YAML.
- Workflow retry policies are durable and maintain their state across application restarts, whereas Dapr Resiliency policies are not durable and must be re-applied after application restarts.
- Workflow retry policies are triggered by unhandled errors/exceptions in activities and child workflows, whereas Dapr Resiliency policies are triggered by operation timeouts and connectivity faults.
Retries are internally implemented using durable timers. This means that workflows can be safely unloaded from memory while waiting for a retry to fire, conserving system resources. This also means that delays between retries can be arbitrarily long, including minutes, hours, or even days.
{{% alert title="Note" color="primary" %}}
The actions performed by a retry policy are saved into a workflow's history. Care must be taken not to change the behavior of a retry policy after a workflow has already been executed. Otherwise, the workflow may behave unexpectedly when replayed. See the notes on [updating workflow code]({{< ref "#updating-workflow-code" >}}) for more information.
{{% /alert %}}
It's possible to use both workflow retry policies and Dapr Resiliency policies together. For example, if a workflow activity uses a Dapr client to invoke a service, the Dapr client uses the configured resiliency policy. See [Quickstart: Service-to-service resiliency]({{< ref "#resiliency-serviceinvo-quickstart" >}}) for more information and an example. However, if the activity itself fails for any reason, including exhausting the retries on the resiliency policy, then the workflow's retry policy kicks in.
{{% alert title="Note" color="primary" %}}
Using workflow retry policies and resiliency policies together can result in unexpected behavior. For example, if a workflow activity exhausts its configured resiliency policy, the workflow engine will still retry the activity according to the workflow retry policy. This can result in the activity being retried more times than expected.
{{% /alert %}}
Because workflow retry policies are configured in code, the exact developer experience may vary depending on the version of the workflow SDK. In general, workflow retry policies can be configured with the following parameters.
| Parameter | Description |
| --- | --- |
| **Maximum number of attempts** | The maximum number of times to execute the activity or child workflow. |
| **First retry interval** | The amount of time to wait before the first retry. |
| **Backoff coefficient** | The multiplier applied to the delay before each subsequent retry. |
| **Maximum retry interval** | The maximum amount of time to wait before each subsequent retry. |
| **Retry timeout** | The overall timeout for retries, regardless of any configured max number of attempts. |
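For example, in the .NET SDK these parameters map onto a retry policy passed when scheduling an activity. The following is a sketch, assuming the `WorkflowRetryPolicy` and `WorkflowTaskOptions` types from `Dapr.Workflow` (parameter names may differ between SDK versions):
```csharp
// Retry the payment activity up to 5 times, starting 5 seconds between
// attempts and doubling each time, giving up after 1 hour overall.
var options = new WorkflowTaskOptions
{
    RetryPolicy = new WorkflowRetryPolicy(
        maxNumberOfAttempts: 5,
        firstRetryInterval: TimeSpan.FromSeconds(5),
        backoffCoefficient: 2.0,
        maxRetryInterval: TimeSpan.FromMinutes(10),
        retryTimeout: TimeSpan.FromHours(1)),
};

await context.CallActivityAsync(nameof(ProcessPaymentActivity), paymentRequest, options);
```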
## External events
Sometimes workflows will need to wait for events that are raised by external systems. For example, an approval workflow may require a human to explicitly approve an order request within an order processing workflow if the total cost exceeds some threshold. Another example is a trivia game orchestration workflow that pauses while waiting for all participants to submit their answers to trivia questions. These mid-execution inputs are referred to as _external events_.
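In the .NET authoring SDK, waiting for such an event is a single awaited call on the workflow context. A sketch (the event name, payload type, and timeout overload are illustrative):
```csharp
// Block durably until "approval_received" arrives, or fail the task
// with a timeout after 24 hours.
ApprovalResult approval = await context.WaitForExternalEventAsync<ApprovalResult>(
    "approval_received", TimeSpan.FromHours(24));
```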

View File

@ -11,7 +11,7 @@ Let's take a look at Dapr's [Bindings building block]({{< ref bindings >}}). Usi
- Trigger your app with events coming in from external systems.
- Interface with external systems.
In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron](https://docs.dapr.io/reference/components-reference/supported-bindings/cron/) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL](https://docs.dapr.io/reference/components-reference/supported-bindings/postgres) Dapr binding.
In this Quickstart, you will schedule a batch script to run every 10 seconds using an input [Cron]({{< ref cron.md >}}) binding. The script processes a JSON file and outputs data to a SQL database using the [PostgreSQL]({{< ref postgresql.md >}}) Dapr binding.
<img src="/images/bindings-quickstart/bindings-quickstart.png" width=800 style="padding-bottom:15px;">
@ -98,7 +98,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin
def process_batch():
```
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
```python
with DaprClient() as d:
@ -140,7 +140,7 @@ In a new terminal, verify the same data has been inserted into the database. Nav
cd bindings/db
```
Run the following to start the interactive Postgres CLI:
Run the following to start the interactive *psql* CLI:
```bash
docker exec -i -t postgres psql --username postgres -p 5432 -h localhost --no-password
@ -193,16 +193,16 @@ spec:
**Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked.
#### `component\binding-postgres.yaml` component file
#### `component\binding-postgresql.yaml` component file
When you execute the `dapr run` command and specify the component path, the Dapr sidecar:
- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file
- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file
With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following:
The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following:
```yaml
apiVersion: dapr.io/v1alpha1
@ -211,7 +211,7 @@ metadata:
name: sqldb
namespace: quickstarts
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required
@ -304,7 +304,7 @@ async function start() {
}
```
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "##componentsbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
```javascript
async function processBatch(){
@ -395,16 +395,16 @@ spec:
**Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked.
#### `component\binding-postgres.yaml` component file
#### `component\binding-postgresql.yaml` component file
When you execute the `dapr run` command and specify the component path, the Dapr sidecar:
- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file
- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file
With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following:
The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following:
```yaml
apiVersion: dapr.io/v1alpha1
@ -413,7 +413,7 @@ metadata:
name: sqldb
namespace: quickstarts
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required
@ -506,7 +506,7 @@ app.MapPost("/" + cronBindingName, async () => {
});
```
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
```csharp
// ...
@ -599,16 +599,16 @@ spec:
**Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked.
#### `component\binding-postgres.yaml` component file
#### `component\binding-postgresql.yaml` component file
When you execute the `dapr run` command and specify the component path, the Dapr sidecar:
- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file
- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file
With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following:
The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following:
```yaml
apiVersion: dapr.io/v1alpha1
@ -617,7 +617,7 @@ metadata:
name: sqldb
namespace: quickstarts
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required
@ -711,7 +711,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin
public ResponseEntity<String> processBatch() throws IOException, Exception
```
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
```java
try (DaprClient client = new DaprClientBuilder().build()) {
@ -809,16 +809,16 @@ spec:
**Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked.
#### `component\binding-postgres.yaml` component file
#### `component\binding-postgresql.yaml` component file
When you execute the `dapr run` command and specify the component path, the Dapr sidecar:
- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file
- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file
With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following:
The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following:
```yaml
apiVersion: dapr.io/v1alpha1
@ -827,7 +827,7 @@ metadata:
name: sqldb
namespace: quickstarts
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required
@ -918,7 +918,7 @@ The code inside the `process_batch` function is executed every 10 seconds (defin
r.HandleFunc("/"+cronBindingName, processBatch).Methods("POST")
```
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgres.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
The `batch-sdk` service uses the PostgreSQL output binding defined in the [`binding-postgresql.yaml`]({{< ref "#componentbinding-postgresyaml-component-file" >}}) component to insert the `OrderId`, `Customer`, and `Price` records into the `orders` table.
```go
func sqlOutput(order Order) (err error) {
@ -1021,16 +1021,16 @@ spec:
**Note:** The `metadata` section of `binding-cron.yaml` contains a [Cron expression]({{< ref cron.md >}}) that specifies how often the binding is invoked.
#### `component\binding-postgres.yaml` component file
#### `component\binding-postgresql.yaml` component file
When you execute the `dapr run` command and specify the component path, the Dapr sidecar:
- Initiates the PostgreSQL [binding building block]({{< ref postgres.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgres.yaml` file
- Initiates the PostgreSQL [binding building block]({{< ref postgresql.md >}})
- Connects to PostgreSQL using the settings specified in the `binding-postgresql.yaml` file
With the `binding-postgres.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
With the `binding-postgresql.yaml` component, you can easily swap out the backend database [binding]({{< ref supported-bindings.md >}}) without making code changes.
The PostgreSQL `binding-postgres.yaml` file included for this Quickstart contains the following:
The PostgreSQL `binding-postgresql.yaml` file included for this Quickstart contains the following:
```yaml
apiVersion: dapr.io/v1alpha1
@ -1039,7 +1039,7 @@ metadata:
name: sqldb
namespace: quickstarts
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required

View File

@ -75,6 +75,14 @@ Persists the change to the state for an actor as a multi-item transaction.
***Note that this operation depends on using a state store component that supports multi-item transactions.***
When putting state, _always_ set the `ttlInSeconds` field in the metadata for each value, unless there is a state clean-up process out of band of Dapr. Omitting this field will result in the underlying actor state store growing indefinitely.
See the Dapr Community Call 80 recording for more details on actor state TTL.
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/kVpQYkGemRc?start=28" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
#### HTTP Request
```
@ -109,7 +117,10 @@ curl -X POST http://localhost:3500/v1.0/actors/stormtrooper/50/state \
"operation": "upsert",
"request": {
"key": "key1",
"value": "myData"
"value": "myData",
"metadata": {
"ttlInSeconds": "3600"
}
}
},
{

View File

@ -12,7 +12,7 @@ Dapr provides users with the ability to interact with workflows and comes with a
Start a workflow instance with the given name and optionally, an instance ID.
```bash
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<workflowName>/start[?instanceId=<instanceId>]
```
@ -22,7 +22,7 @@ Note that workflow instance IDs can only contain alphanumeric characters, unders
Parameter | Description
--------- | -----------
`workflowComponentName` | Current default is `dapr` for Dapr Workflows
`workflowComponentName` | Use `dapr` for Dapr Workflows
`workflowName` | Identify the workflow type
`instanceId` | (Optional) Unique value created for each run of a specific workflow
@ -52,7 +52,7 @@ The API call will provide a response similar to this:
Terminate a running workflow instance with the given name and instance ID.
```bash
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>/terminate
```
@ -60,7 +60,7 @@ POST http://localhost:3500/v1.0-alpha1/workflows/<instanceId>/terminate
Parameter | Description
--------- | -----------
`workflowComponentName` | Current default is `dapr` for Dapr Workflows
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
### HTTP response codes
@ -75,11 +75,125 @@ Code | Description
This API does not return any content.
### Get workflow request
## Raise event request
For workflow components that support subscribing to external events, such as the Dapr Workflow engine, you can use the following "raise event" API to deliver a named event to a specific workflow instance.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceID>/raiseEvent/<eventName>
```
{{% alert title="Note" color="primary" %}}
The exact mechanism for subscribing to an event depends on the workflow component that you're using. Dapr Workflow has one way of subscribing to external events but other workflow components might have different ways.
{{% /alert %}}
### URL parameters
Parameter | Description
--------- | -----------
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
`eventName` | The name of the event to raise
### HTTP response codes
Code | Description
---- | -----------
`202` | Accepted
`400` | Request was malformed
`500` | Request formatted correctly, error in Dapr code or underlying component
### Response content
None.
## Pause workflow request
Pause a running workflow instance.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>/pause
```
### URL parameters
Parameter | Description
--------- | -----------
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
### HTTP response codes
Code | Description
---- | -----------
`202` | Accepted
`400` | Request was malformed
`500` | Error in Dapr code or underlying component
### Response content
None.
## Resume workflow request
Resume a paused workflow instance.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>/resume
```
### URL parameters
Parameter | Description
--------- | -----------
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
### HTTP response codes
Code | Description
---- | -----------
`202` | Accepted
`400` | Request was malformed
`500` | Error in Dapr code or underlying component
### Response content
None.
## Purge workflow request
Purge the workflow state from your state store with the workflow's instance ID.
```http
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>/purge
```
### URL parameters
Parameter | Description
--------- | -----------
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
### HTTP response codes
Code | Description
---- | -----------
`202` | Accepted
`400` | Request was malformed
`500` | Error in Dapr code or underlying component
### Response content
None.
## Get workflow request
Get information about a given workflow instance.
```bash
```http
GET http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>
```
@ -87,7 +201,7 @@ GET http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanc
Parameter | Description
--------- | -----------
`workflowComponentName` | Current default is `dapr` for Dapr Workflows
`workflowComponentName` | Use `dapr` for Dapr Workflows
`instanceId` | Unique value created for each run of a specific workflow
### HTTP response codes
@ -115,6 +229,10 @@ The API call will provide a JSON response similar to this:
}
```
Parameter | Description
--------- | -----------
`runtimeStatus` | The status of the workflow instance. Values include: `RUNNING`, `TERMINATED`, `PAUSED`
## Component format
A Dapr `workflow.yaml` component file has the following structure:

View File

@ -4,12 +4,13 @@ title: "PostgreSQL binding spec"
linkTitle: "PostgreSQL"
description: "Detailed documentation on the PostgreSQL binding component"
aliases:
- "/operations/components/setup-bindings/supported-bindings/postgresql/"
- "/operations/components/setup-bindings/supported-bindings/postgres/"
---
## Component format
To setup PostgreSQL binding create a component of type `bindings.postgres`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration.
To set up a PostgreSQL binding, create a component of type `bindings.postgresql`. See [this guide]({{< ref "howto-bindings.md#1-create-a-binding" >}}) on how to create and apply a binding configuration.
```yaml
@ -18,7 +19,7 @@ kind: Component
metadata:
name: <NAME>
spec:
type: bindings.postgres
type: bindings.postgresql
version: v1
metadata:
- name: url # Required
@ -33,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| Field | Required | Binding support | Details | Example |
|--------------------|:--------:|------------|-----|---------|
| url | Y | Output | Postgres connection string See [here](#url-format) for more details | `"user=dapr password=secret host=dapr.example.com port=5432 dbname=dapr sslmode=verify-ca"` |
| url | Y | Output | PostgreSQL connection string. See [here](#url-format) for more details | `"user=dapr password=secret host=dapr.example.com port=5432 dbname=dapr sslmode=verify-ca"` |
### URL format
@ -144,8 +145,7 @@ Finally, the `close` operation can be used to explicitly close the DB connection
}
```
> Note, the PostgreSql binding itself doesn't prevent SQL injection, like with any database application, validate the input before executing query.
> Note: the PostgreSQL binding itself doesn't prevent SQL injection; as with any database application, validate the input before executing a query.
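To make that note concrete, here is a hedged sketch of calling the binding's `exec` operation from the .NET SDK; the component name is illustrative, and the naive quote check stands in for whatever input validation your application actually needs:
```csharp
using System;
using System.Collections.Generic;
using Dapr.Client;

var client = new DaprClientBuilder().Build();

// The binding executes this SQL string as-is, so validate inputs strictly
// before interpolating them (this check is only a placeholder).
string orderName = "Paperclips";
if (orderName.Contains('\'')) throw new ArgumentException("Invalid input");

await client.InvokeBindingAsync<object>(
    "my-postgresql-binding",  // binding component name (illustrative)
    "exec",                   // binding operation
    null,                     // no request body; the SQL travels in metadata
    new Dictionary<string, string>
    {
        { "sql", $"INSERT INTO orders (name) VALUES ('{orderName}')" }
    });
```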
## Related links

View File

@ -1,15 +1,16 @@
---
type: docs
title: "Postgres"
linkTitle: "Postgres"
description: Detailed information on the Postgres configuration store component
title: "PostgreSQL"
linkTitle: "PostgreSQL"
description: Detailed information on the PostgreSQL configuration store component
aliases:
- "/operations/components/setup-configuration-store/supported-configuration-stores/setup-postgresql/"
- "/operations/components/setup-configuration-store/supported-configuration-stores/setup-postgres/"
---
## Component format
To set up an Postgres configuration store, create a component of type `configuration.postgres`
To set up a PostgreSQL configuration store, create a component of type `configuration.postgresql`
```yaml
apiVersion: dapr.io/v1alpha1
@ -17,7 +18,7 @@ kind: Component
metadata:
name: <NAME>
spec:
type: configuration.postgres
type: configuration.postgresql
version: v1
metadata:
- name: connectionString
@ -40,10 +41,10 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| connectionString | Y | The connection string for PostgreSQL. Default pool_max_conns = 5 | `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test pool_max_conns=10"`
| table | Y | Table name for configuration information. | `configTable`
## Set up Postgres as Configuration Store
## Set up PostgreSQL as Configuration Store
1. Start Postgres Database
1. Connect to the Postgres database and setup a configuration table with following schema -
1. Start the PostgreSQL database.
1. Connect to the PostgreSQL database and set up a configuration table with the following schema:
| Field | Datatype | Nullable |Details |
|--------------------|:--------:|---------|---------|
@ -101,13 +102,13 @@ AFTER INSERT OR UPDATE OR DELETE ON configTable
7. In the subscribe request, add an additional metadata field with the key `pgNotifyChannel`, and set its value to the same channel name specified in `pg_notify`. From the above example, it should be set to `config`.
{{% alert title="Note" color="primary" %}}
When calling `subscribe` API, `metadata.pgNotifyChannel` should be used to specify the name of the channel to listen for notifications from Postgres configuration store.
When calling the `subscribe` API, `metadata.pgNotifyChannel` should be used to specify the name of the channel to listen on for notifications from the PostgreSQL configuration store.
Any number of keys can be added to a subscription request. Each subscription uses an exclusive database connection. It is strongly recommended to subscribe to multiple keys within a single subscription. This helps optimize the number of connections to the database.
Example of subscribe HTTP API -
```bash
curl --location --request GET 'http://<host>:<dapr-http-port>/configuration/postgres/subscribe?key=<keyname1>&key=<keyname2>&metadata.pgNotifyChannel=<channel name>'
curl --location --request GET 'http://<host>:<dapr-http-port>/configuration/mypostgresql/subscribe?key=<keyname1>&key=<keyname2>&metadata.pgNotifyChannel=<channel name>'
```
{{% /alert %}}

View File

@ -11,7 +11,7 @@ This component allows using PostgreSQL (Postgres) as state store for Dapr.
## Create a Dapr component
Create a file called `postgres.yaml`, paste the following and replace the `<CONNECTION STRING>` value with your connection string. The connection string is a standard PostgreSQL connection string. For example, `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test"`. See the PostgreSQL [documentation on database connections](https://www.postgresql.org/docs/current/libpq-connect.html) for information on how to define a connection string.
Create a file called `postgresql.yaml`, paste the following, and replace the `<CONNECTION STRING>` value with your connection string. The connection string is a standard PostgreSQL connection string. For example, `"host=localhost user=postgres password=example port=5432 connect_timeout=10 database=dapr_test"`. See the PostgreSQL [documentation on database connections](https://www.postgresql.org/docs/current/libpq-connect.html) for information on how to define a connection string.
If you want to also configure PostgreSQL to store actors, add the `actorStateStore` option as in the example below.

View File

@ -70,8 +70,8 @@
features:
input: false
output: true
- component: PostgreSql
link: postgres
- component: PostgreSQL
link: postgresql
state: Stable
version: v1
since: "1.9"

View File

@ -3,8 +3,8 @@
state: Stable
version: v1
since: "1.11"
- component: Postgres
link: postgres-configuration-store
- component: PostgreSQL
link: postgresql-configuration-store
state: Stable
version: v1
since: "1.11"