mirror of https://github.com/dapr/docs.git
pass pt 1
Signed-off-by: Hannah Hunter <hannahhunter@microsoft.com>
This commit is contained in: parent 54b1bda0f2 · commit b8efe4ad2e
@@ -6,11 +6,17 @@ weight: 4000
description: "Learn how to develop and author workflows"
---

This article provides a high-level overview of how to author workflows that are executed by the Dapr Workflow engine. In particular, this article lists the SDKs available, supported authoring patterns, and introduces the various concepts you'll need to understand when building Dapr workflows.
This article provides a high-level overview of how to author workflows that are executed by the Dapr Workflow engine.

{{% alert title="Note" color="primary" %}}
If you haven't already, [try out the .NET SDK Workflow example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow) for a quick walk-through on how to use Dapr Workflows.
{{% /alert %}}

## Author workflows as code

Dapr workflow logic is implemented using general purpose programming languages, allowing you to:
Dapr Workflow logic is implemented using general purpose programming languages, allowing you to:

- Use your preferred programming language (no need to learn a new DSL or YAML schema)
- Have access to the language’s standard libraries

@@ -23,7 +29,13 @@ The Dapr sidecar doesn’t load any workflow definitions. Rather, the sidecar si

### Register the workflow

To start using the workflow building block, you simply write the workflow details directly into your application code. [In the following example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowWebApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include:
To start using the workflow building block, simply write the workflow details directly into your application code.

{{< tabs ".NET" >}}

{{% codetab %}}

[In the following example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowWebApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include:

- A NuGet package called `Dapr.Workflow` to receive the .NET SDK capabilities
- A builder with an extension method called `AddDaprWorkflow`

@@ -36,7 +48,7 @@ To start using the workflow building block, you simply write the workflow detail
using Dapr.Workflow;
//...

// Dapr workflows are registered as part of the service configuration
// Dapr Workflows are registered as part of the service configuration
builder.Services.AddDaprWorkflow(options =>
{
    // Note that it's also possible to register a lambda function as the workflow

@@ -88,10 +100,19 @@ app.MapGet("/orders/{orderId}", async (string orderId, WorkflowEngineClient clie
app.Run();
```

{{% /codetab %}}

{{< /tabs >}}
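
For reference, the full registration typically has roughly the following shape. This is a minimal sketch assuming the alpha `Dapr.Workflow` API (`AddDaprWorkflow`, `RegisterWorkflow`, `RegisterActivity`); the workflow and activity type names are illustrative placeholders rather than the exact types in the linked example.

```csharp
using Dapr.Workflow;

var builder = WebApplication.CreateBuilder(args);

// Register the workflow runtime plus every workflow and activity type the app hosts.
// OrderProcessingWorkflow, NotifyActivity, and ProcessPaymentActivity are placeholder
// names for types you define elsewhere in the project.
builder.Services.AddDaprWorkflow(options =>
{
    options.RegisterWorkflow<OrderProcessingWorkflow>();
    options.RegisterActivity<NotifyActivity>();
    options.RegisterActivity<ProcessPaymentActivity>();
});

var app = builder.Build();

app.Run();
```

Registering by type lets the workflow runtime construct these classes (and their injected dependencies) when the Dapr sidecar dispatches work to the app.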

### Register the workflow activities

Next, you'll define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like a class constructor to access the logger for ASP.NET or binding to a Dapr client.

{{< tabs ".NET" >}}

{{% codetab %}}

Continuing the ASP.NET order processing example, the `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types.

It also includes a `RunAsync` method that will do the heavy lifting of the workflow and call the workflow activities. The activities called in the example are:

@@ -130,6 +151,9 @@ It also includes a `RunAsync` method that will do the heavy lifting of the workf
}
```

{{% /codetab %}}

{{< /tabs >}}
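
Only the tail end of the workflow class appears above, so here is the rough shape of an activity definition to go with it — a sketch assuming the `Dapr.Workflow` base class `WorkflowActivity<TInput, TOutput>`; the `NotifyActivity` class and `Notification` record are illustrative stand-ins for the activities in the example.

```csharp
using System.Threading.Tasks;
using Dapr.Workflow;
using Microsoft.Extensions.Logging;

// Illustrative input type for the activity.
public record Notification(string Message);

// Activities derive from WorkflowActivity<TInput, TOutput> and can use constructor
// injection, for example to get an ILogger or a DaprClient.
public class NotifyActivity : WorkflowActivity<Notification, object?>
{
    private readonly ILogger logger;

    public NotifyActivity(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger<NotifyActivity>();
    }

    public override Task<object?> RunAsync(WorkflowActivityContext context, Notification notification)
    {
        // I/O and other side effects belong here, not in the workflow method.
        this.logger.LogInformation("Received notification: {message}", notification.Message);
        return Task.FromResult<object?>(null);
    }
}
```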

{{% alert title="Important" color="primary" %}}
Because of how replay-based workflows execute, you'll write most logic that does things like IO and interacting with systems **inside activities**. Meanwhile, the **workflow method** is just for orchestrating those activities.

@@ -139,5 +163,9 @@ Because of how replay-based workflows execute, you'll write most logic that does

## Next steps

{{< button text="Manage workflows >>" page="howto-manage-workflow.md" >}}

## Related links
- [Learn more about the Workflow API]({{< ref workflow-overview.md >}})
- [Workflow API reference]({{< ref workflow_api.md >}})
- [Workflow API reference]({{< ref workflow_api.md >}})
- Learn more about [how to manage workflows with the .NET SDK](todo) and try out [the .NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)

@@ -6,21 +6,19 @@ weight: 5000
description: Manage and expose workflows
---

Now that you've read about [the Workflow building block]({{< ref workflow-overview >}}), let's dive into how
Now that you've [set up the workflow and its activities in your application]({{< ref howto-author-workflow.md >}}), you can start, terminate, and get metadata status for the workflow using HTTP API calls. For more information, read the [workflow API reference]({{< ref workflow_api.md >}}).

{{% alert title="Note" color="primary" %}}
If you haven't already, [try out the .NET SDK Workflow example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow) for a quick walk-through on how to use the service invocation API.
If you haven't already, [try out the workflow quickstart](todo) for a quick walk-through on how to use workflows.
{{% /alert %}}

Now that you've [set up the workflow and its activities in your application]({{< ref howto-author-workflow.md >}}), you can start, terminate, and get metadata status for the workflow using HTTP API calls. For more information, read the [workflow API reference]({{< ref workflow_api.md >}}).

## Start workflow

To start your workflow, run:

```bash
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<workflowName>/<instanceId>/start
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponent>/<workflowName>/<instanceId>/start
```

## Terminate workflow

@@ -28,7 +26,7 @@ POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<workfl
To terminate your workflow, run:

```bash
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instanceId>/terminate
POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponent>/<instanceId>/terminate
```

### Get metadata for a workflow

@@ -36,11 +34,10 @@ POST http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<instan
To fetch workflow outputs and inputs, run:

```bash
GET http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<workflowName>/<instanceId>
GET http://localhost:3500/v1.0-alpha1/workflows/<workflowComponent>/<workflowName>/<instanceId>
```
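
If you prefer to drive these endpoints from code instead of a raw HTTP tool, the following is a minimal C# sketch using `HttpClient` against the URLs above. The `dapr` component name, `OrderProcessingWorkflow` workflow name, `12345678` instance ID, and the JSON body are example values, and passing the workflow input as the start request body is an assumption to verify against the workflow API reference.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Example values -- substitute your own component name, workflow name, and instance ID.
const string baseUrl = "http://localhost:3500/v1.0-alpha1/workflows";
const string component = "dapr";
const string workflowName = "OrderProcessingWorkflow";
const string instanceId = "12345678";

using var client = new HttpClient();

// Start the workflow, assuming the request body is passed through as the workflow input.
var startResponse = await client.PostAsync(
    $"{baseUrl}/{component}/{workflowName}/{instanceId}/start",
    new StringContent("{\"Name\": \"Paperclips\", \"Quantity\": 1}", Encoding.UTF8, "application/json"));
Console.WriteLine($"Start: {startResponse.StatusCode}");

// Fetch the workflow's metadata (status, inputs, and outputs).
var metadata = await client.GetStringAsync(
    $"{baseUrl}/{component}/{workflowName}/{instanceId}");
Console.WriteLine($"Metadata: {metadata}");

// Terminate the workflow if it should stop early.
var terminateResponse = await client.PostAsync(
    $"{baseUrl}/{component}/{instanceId}/terminate", null);
Console.WriteLine($"Terminate: {terminateResponse.StatusCode}");
```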

## Next steps

- Learn more about [authoring workflows]({{< ref howto-author-workflow.md >}})
- Learn more about [workflow architecture]({{< ref workflow-architecture.md >}}) and [workflow capabilities]({{< ref workflow-capabilities.md >}})
- Learn more about [how to manage workflows with the .NET SDK](todo) and try out [the .NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)

@@ -1,18 +1,18 @@
---
type: docs
title: "Dapr workflow architecture"
linkTitle: "Dapr workflow architecture"
title: "Dapr Workflow architecture"
linkTitle: "Dapr Workflow architecture"
weight: 3000
description: "Overview of the Dapr workflow engine architecture"
description: "Overview of the Dapr Workflow engine architecture"
---

# Overview
[Dapr Workflows]({{< ref "workflow-overview.md" >}}) allow developers to define workflows using ordinary code in a variety of programming languages. The workflow engine runs inside of the Dapr sidecar and orchestrates workflow code deployed as part of your application. This article describes:
The Dapr workflow engine is a component that allows developers to define workflows using ordinary code in a variety of programming languages. The workflow engine runs inside of the Dapr sidecar and orchestrates workflow code that is deployed as part of your application. This article describes the architecture of the Dapr workflow engine, how it interacts with application code, and how it fits into the overall Dapr architecture.
- The architecture of the Dapr Workflow engine
- How the workflow engine interacts with application code
- How the workflow engine fits into the overall Dapr architecture

{{% alert title="Note" color="primary" %}}
For information on how to author workflows that are executed by the Dapr workflow engine, see the [workflow application developer guide]({{<ref "workflow-overview.md" >}}).
{{% /alert %}}
For more information on how to author Dapr Workflows in your application, see [How to: Author a workflow]({{< ref "workflow-overview.md" >}}).

The Dapr Workflow engine is internally implemented using the open source [durabletask-go](https://github.com/microsoft/durabletask-go) library, which is embedded directly into the Dapr sidecar. Dapr implements a custom durable task "backend" using internally managed actors, which manage workflow scale-out, persistence, and leader election. This article goes into more detail in subsequent sections.

@@ -65,7 +65,7 @@ Each workflow actor saves its state using the following keys in the configured s
* `metadata`: Contains meta information about the workflow as a JSON blob and includes details such as the length of the inbox, the length of the history, and a 64-bit integer representing the workflow generation (for cases where the instance ID gets reused). The length information is used to determine which keys need to be read or written to when loading or saving workflow state updates.

{{% alert title="Warning" color="primary" %}}
In the alpha release of the Dapr workflow engine, workflow actor state will remain in the state store even after a workflow has completed. Creating a large number of workflows could result in unbounded storage usage. In a future release, data retention policies will be introduced that can automatically purge the state store of old workflow state.
In the alpha release of the Dapr Workflow engine, workflow actor state will remain in the state store even after a workflow has completed. Creating a large number of workflows could result in unbounded storage usage. In a future release, data retention policies will be introduced that can automatically purge the state store of old workflow state.
{{% /alert %}}

The following diagram illustrates the typical lifecycle of a workflow actor.

@@ -83,7 +83,7 @@ Each activity actor stores a single key into the state store:
* `activityreq-N`: The key contains the activity invocation payload, which includes the serialized activity input data. The `N` value is a 64-bit unsigned integer that represents the _generation_ of the workflow, a concept which is outside the scope of this documentation.

{{% alert title="Warning" color="primary" %}}
In the alpha release of the Dapr workflow engine, activity actor state will remain in the state store even after the activity task has completed. Scheduling a large number of workflow activities could result in unbounded storage usage. In a future release, data retention policies will be introduced that can automatically purge the state store of completed activity state.
In the alpha release of the Dapr Workflow engine, activity actor state will remain in the state store even after the activity task has completed. Scheduling a large number of workflow activities could result in unbounded storage usage. In a future release, data retention policies will be introduced that can automatically purge the state store of completed activity state.
{{% /alert %}}

Activity actors are short-lived. They are activated when a workflow actor schedules an activity task and will immediately call into the workflow application to invoke the associated activity code. Once the activity code has finished running and has returned its result, the activity actor will send a message to the parent workflow actor with the execution results, triggering the workflow to move forward to its next step.

@@ -94,17 +94,17 @@ Activity actors are short-lived. They are activated when a workflow actor schedu

TODO: Describe how reminders are used, and what kinds of reminder pressure may be added to a system.
The Dapr workflow engine ensures workflow fault-tolerance by using actor reminders to recover from transient system failures. Prior to invoking application workflow code, the workflow or activity actor will create a new reminder. If the application code executes without interruption, the reminder is deleted. However, if the node or the sidecar hosting the associated workflow or activity crashes, the reminder will reactivate the corresponding actor and the execution will be retried.
The Dapr Workflow engine ensures workflow fault-tolerance by using actor reminders to recover from transient system failures. Prior to invoking application workflow code, the workflow or activity actor will create a new reminder. If the application code executes without interruption, the reminder is deleted. However, if the node or the sidecar hosting the associated workflow or activity crashes, the reminder will reactivate the corresponding actor and the execution will be retried.

TODO: Diagrams showing the process of invoking workflow and activity actors

{{% alert title="Important" color="warning" %}}
Too many active reminders in a cluster may result in performance issues. If your application is already using actors and reminders heavily, be mindful of the additional load that Dapr workflows may add to your system.
Too many active reminders in a cluster may result in performance issues. If your application is already using actors and reminders heavily, be mindful of the additional load that Dapr Workflows may add to your system.
{{% /alert %}}

### State store usage

Dapr workflows use actors internally to drive the execution of workflows. Like any actors, these internal workflow actors store their state in the configured state store. Any state store that supports actors implicitly supports Dapr workflow.
Dapr Workflows use actors internally to drive the execution of workflows. Like any actors, these internal workflow actors store their state in the configured state store. Any state store that supports actors implicitly supports Dapr Workflow.

As discussed in the [workflow actors]({{< ref workflow-architecture.md >}}) section, workflows save their state incrementally by appending to a history log. The history log for a workflow is distributed across multiple state store keys so that each "checkpoint" only needs to append the newest entries.

@@ -116,7 +116,7 @@ Different state store implementations may implicitly put restrictions on the typ

## Workflow scalability

Because Dapr workflows are internally implemented using actors, Dapr workflows have the same scalability characteristics as actors. The placement service doesn't distinguish workflow actors and actors you define in your application and will load balance workflows using the same algorithms that it uses for actors.
Because Dapr Workflows are internally implemented using actors, Dapr Workflows have the same scalability characteristics as actors. The placement service doesn't distinguish workflow actors and actors you define in your application and will load balance workflows using the same algorithms that it uses for actors.

The expected scalability of a workflow is determined by the following factors:

@@ -130,16 +130,16 @@ The implementation details of the workflow code in the target application also p
TODO: Diagram showing an example distribution of workflows, child-workflows, and activity tasks.

{{% alert title="Important" color="warning" %}}
At the time of writing, there are no global limits imposed on workflow and activity concurrency. A runaway workflow could therefore potentially consume all resources in a cluster if it attempts to schedule too many tasks in parallel. Developers should use care when authoring Dapr workflows that schedule large batches of work in parallel.
At the time of writing, there are no global limits imposed on workflow and activity concurrency. A runaway workflow could therefore potentially consume all resources in a cluster if it attempts to schedule too many tasks in parallel. Developers should use care when authoring Dapr Workflows that schedule large batches of work in parallel.

It's also worth noting that the Dapr workflow engine requires that all instances of each workflow app register the exact same set of workflows and activities. In other words, it's not possible to scale certain workflows or activities independently. All workflows and activities within an app must be scaled together.
It's also worth noting that the Dapr Workflow engine requires that all instances of each workflow app register the exact same set of workflows and activities. In other words, it's not possible to scale certain workflows or activities independently. All workflows and activities within an app must be scaled together.
{{% /alert %}}

Workflows don't control the specifics of how load is distributed across the cluster. For example, if a workflow schedules 10 activity tasks to run in parallel, all 10 tasks may run on as many as 10 different compute nodes or as few as a single compute node. The actual scale behavior is determined by the actor placement service, which manages the distribution of the actors that represent each of the workflow's tasks.

## Workflow latency

In order to provide guarantees around durability and resiliency, Dapr workflows frequently write to the state store and rely on reminders to drive execution. Dapr workflows therefore may not be appropriate for latency-sensitive workloads. Expected sources of high latency include:
In order to provide guarantees around durability and resiliency, Dapr Workflows frequently write to the state store and rely on reminders to drive execution. Dapr Workflows therefore may not be appropriate for latency-sensitive workloads. Expected sources of high latency include:

* Latency from the state store when persisting workflow state.
* Latency from the state store when rehydrating workflows with large histories.

@@ -150,3 +150,9 @@ See the [Reminder usage and execution guarantees]({{< ref workflow-architecture.

## Next steps

{{< button text="Author workflows >>" page="howto-author-workflow.md" >}}

## Related links
- [Learn more about the Workflow API]({{< ref workflow-overview.md >}})
- [Workflow API reference]({{< ref workflow_api.md >}})
- Learn more about [how to manage workflows with the .NET SDK](todo) and try out [the .NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)

@@ -10,7 +10,7 @@ Now that you've learned about the [workflow building block]({{< ref workflow-ove

## Workflows

Dapr workflows are functions you write that define a series of steps or tasks to be executed in a particular order. The Dapr workflow engine takes care of coordinating and managing the execution of the steps, including managing failures and retries. If the app hosting your workflows is scaled out across multiple machines, the workflow engine may also load balance the execution of workflows and their tasks across multiple machines.
Dapr Workflows are functions you write that define a series of steps or tasks to be executed in a particular order. The Dapr Workflow engine takes care of coordinating and managing the execution of the steps, including managing failures and retries. If the app hosting your workflows is scaled out across multiple machines, the workflow engine may also load balance the execution of workflows and their tasks across multiple machines.

There are several different kinds of tasks that a workflow can schedule, including [activities]({{< ref "workflow-capabilities.md#workflow-activities" >}}) for executing custom logic, [durable timers]({{< ref "workflow-capabilities.md#durable-timers" >}}) for putting the workflow to sleep for arbitrary lengths of time, [child workflows]({{< ref "workflow-capabilities.md#child-workflows" >}}) for breaking larger workflows into smaller pieces, and [external event waiters]({{< ref "workflow-capabilities.md#external-events" >}}) for blocking workflows until they receive external event signals. These tasks are described in more detail in their corresponding sections.
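
To make those task types concrete, here is a rough sketch of what a workflow that uses each of them might look like with the .NET authoring SDK. It assumes `WorkflowContext` exposes `CallActivityAsync`, `CallChildWorkflowAsync`, `CreateTimer`, and `WaitForExternalEventAsync`, and the activity, child workflow, and event names are made up for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Dapr.Workflow;

public class OrderWorkflow : Workflow<string, string>
{
    public override async Task<string> RunAsync(WorkflowContext context, string orderId)
    {
        // Activity: run custom logic (I/O, CPU-heavy work) outside the workflow itself.
        bool reserved = await context.CallActivityAsync<bool>("ReserveInventory", orderId);

        // Child workflow: split a larger workflow into smaller, independently tracked pieces.
        bool paid = await context.CallChildWorkflowAsync<bool>("ProcessPayment", orderId);

        // Durable timer: sleep durably; the workflow can be unloaded while it waits.
        await context.CreateTimer(TimeSpan.FromHours(1), CancellationToken.None);

        // External event: block until an outside system raises the named event.
        bool approved = await context.WaitForExternalEventAsync<bool>("ManagerApproval");

        return (reserved && paid && approved) ? "completed" : "rejected";
    }
}
```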

@@ -22,7 +22,7 @@ Only one workflow instance with a given ID can exist at any given time. However,

### Workflow replay

Dapr workflows maintain their execution state by using a technique known as [event sourcing](https://learn.microsoft.com/azure/architecture/patterns/event-sourcing). Instead of directly storing the current state of a workflow as a snapshot, the workflow engine manages an append-only log of history events that describe the various steps that a workflow has taken. When using the workflow authoring SDK, the storing of these history events happens automatically whenever the workflow "awaits" for the result of a scheduled task.
Dapr Workflows maintain their execution state by using a technique known as [event sourcing](https://learn.microsoft.com/azure/architecture/patterns/event-sourcing). Instead of directly storing the current state of a workflow as a snapshot, the workflow engine manages an append-only log of history events that describe the various steps that a workflow has taken. When using the workflow authoring SDK, the storing of these history events happens automatically whenever the workflow "awaits" for the result of a scheduled task.

{{% alert title="Note" color="primary" %}}
For more information on how workflow state is managed, see the [workflow architecture guide]({{< ref workflow-architecture.md >}}).

@@ -30,7 +30,7 @@ For more information on how workflow state is managed, see the [workflow archite

When a workflow "awaits" a scheduled task, it may unload itself from memory until the task completes. Once the task completes, the workflow engine will schedule the workflow function to run again. This second execution of the workflow function is known as a _replay_. When a workflow function is replayed, it runs again from the beginning. However, when it encounters a task that it already scheduled, instead of scheduling that task again, the workflow engine will return the result of the scheduled task to the workflow and continue execution until the next "await" point. This "replay" behavior continues until the workflow function completes or fails with an error.

Using this replay technique, a workflow is able to resume execution from any "await" point as if it had never been unloaded from memory. Even the values of local variables from previous runs can be restored without the workflow engine knowing anything about what data they stored. This ability to restore state is also what makes Dapr workflows _durable_ and fault tolerant.
Using this replay technique, a workflow is able to resume execution from any "await" point as if it had never been unloaded from memory. Even the values of local variables from previous runs can be restored without the workflow engine knowing anything about what data they stored. This ability to restore state is also what makes Dapr Workflows _durable_ and fault tolerant.
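
One practical consequence of replay is that anything non-deterministic has to come from the workflow context or from an activity, so that every replay observes the same values. A small sketch, assuming the .NET SDK surfaces deterministic helpers such as `context.CurrentUtcDateTime` and `context.NewGuid()` (as the underlying durable task framework does), with a made-up `SaveReceipt` activity:

```csharp
using System;
using System.Threading.Tasks;
using Dapr.Workflow;

public class ReceiptWorkflow : Workflow<string, string>
{
    public override async Task<string> RunAsync(WorkflowContext context, string input)
    {
        // Replay-safe: these values are captured in the history and replayed identically.
        DateTime scheduledAt = context.CurrentUtcDateTime;
        Guid confirmationId = context.NewGuid();

        // Not replay-safe: DateTime.UtcNow, Guid.NewGuid(), random numbers, or direct I/O
        // would yield different results on each replay. Put that logic in an activity instead.
        string receipt = await context.CallActivityAsync<string>("SaveReceipt", confirmationId);

        return $"{receipt} scheduled at {scheduledAt:O}";
    }
}
```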

### Workflow determinism and code constraints

@@ -78,7 +78,7 @@ Workflow activities are the basic unit of work in a workflow and are the tasks t

Unlike workflows, activities aren't restricted in the type of work you can do in them. Activities are frequently used to make network calls or run CPU intensive operations. An activity can also return data back to the workflow.

The Dapr workflow engine guarantees that each called activity will be executed **at least once** as part of a workflow's execution. Because activities only guarantee at-least-once execution, it's recommended that activity logic be implemented as idempotent whenever possible.
The Dapr Workflow engine guarantees that each called activity will be executed **at least once** as part of a workflow's execution. Because activities only guarantee at-least-once execution, it's recommended that activity logic be implemented as idempotent whenever possible.

## Child workflows

@@ -98,7 +98,7 @@ Because child workflows are independent of their parents, terminating a parent w

## Durable timers

Dapr workflows allow you to schedule reminder-like durable delays for any time range, including minutes, days, or even years. These _durable timers_ can be scheduled by workflows to implement simple delays or to set up ad-hoc timeouts on other async tasks. More specifically, a durable timer can be set to trigger on a particular date or after a specified duration. There are no limits to the maximum duration of durable timers, which are backed internally by actor reminders. For example, a workflow that tracks a 30-day free subscription to a service could be implemented using a durable timer that fires 30 days after the workflow is created. Workflows can be safely unloaded from memory while waiting for a durable timer to fire.
Dapr Workflows allow you to schedule reminder-like durable delays for any time range, including minutes, days, or even years. These _durable timers_ can be scheduled by workflows to implement simple delays or to set up ad-hoc timeouts on other async tasks. More specifically, a durable timer can be set to trigger on a particular date or after a specified duration. There are no limits to the maximum duration of durable timers, which are backed internally by actor reminders. For example, a workflow that tracks a 30-day free subscription to a service could be implemented using a durable timer that fires 30 days after the workflow is created. Workflows can be safely unloaded from memory while waiting for a durable timer to fire.

{{% alert title="Note" color="primary" %}}
Some APIs in the workflow authoring SDK may internally schedule durable timers to implement internal timeout behavior.

@@ -118,6 +118,11 @@ Workflows can also wait for multiple external event signals of the same name, in

## Next steps

{{< button text="Workflow architecture >>" page="workflow-architecture.md" >}}

## Related links

- [Try out Dapr Workflows using the quickstart](todo)
- [Learn how to author a workflow]({{< ref howto-author-workflow.md >}})
- [Learn how to manage workflows]({{< ref howto-manage-workflow.md >}})
- [Learn more about the Workflow API]({{< ref workflow-overview.md >}})
- [Workflow API reference]({{< ref workflow_api.md >}})
- Learn more about [how to manage workflows with the .NET SDK](todo) and try out [the .NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)

@@ -216,5 +216,10 @@ Watch [this video for an overview on Dapr Workflows](https://youtu.be/s1p9MNl4VG
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/s1p9MNl4VGo?start=131" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Next steps
- Learn more about [authoring workflows for the built-in engine component]()
- Learn more about [supported workflow components]()

{{< button text="Workflow features and concepts >>" page="workflow-capabilities.md" >}}

## Related links

- [Workflow API reference]({{< ref workflow_api.md >}})
- Learn more about [how to manage workflows with the .NET SDK](todo) and try out [the .NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)

@@ -46,7 +46,7 @@ GET http://localhost:3500/v1.0-alpha1/workflows/<workflowComponentName>/<workflo

Parameter | Description
--------- | -----------
`workflowComponentName` | One of the [supported workflow components][]
`workflowComponentName` | Current default is `dapr` for Dapr Workflows
`workflowName` | Identify the workflow type
`instanceId` | Unique value created for each run of a specific workflow