Merge branch 'v1.11' into 1-11_ENDGAME

This commit is contained in:
Hannah Hunter 2023-06-08 13:18:40 -04:00 committed by GitHub
commit bc6a61a5b7
31 changed files with 753 additions and 97 deletions

@ -15,14 +15,54 @@ Now that you've read about [Cryptography as a Dapr building block]({{< ref crypt
## Encrypt
Using the Dapr gRPC APIs in your project, you can encrypt a stream of data, such as a file.
{{< tabs "JavaScript" "Go" >}}
{{< tabs "Go" >}}
{{% codetab %}}
<!--JavaScript-->
Using the Dapr SDK in your project, with the gRPC APIs, you can encrypt data in a buffer or a string:
```js
// When passing data (a buffer or string), `encrypt` returns a Buffer with the encrypted message
const ciphertext = await client.crypto.encrypt(plaintext, {
  // Name of the Dapr component (required)
  componentName: "mycryptocomponent",
  // Name of the key stored in the component (required)
  keyName: "mykey",
  // Algorithm used for wrapping the key, which must be supported by the key named above.
  // Options include: "RSA", "AES"
  keyWrapAlgorithm: "RSA",
});
```
The APIs can also be used with streams, to encrypt data more efficiently when it comes from a stream. The example below encrypts a file and writes the result to another file, using streams:
```js
// `encrypt` can be used as a Duplex stream
await pipeline(
fs.createReadStream("plaintext.txt"),
await client.crypto.encrypt({
// Name of the Dapr component (required)
componentName: "mycryptocomponent",
// Name of the key stored in the component (required)
keyName: "mykey",
// Algorithm used for wrapping the key, which must be supported by the key named above.
// Options include: "RSA", "AES"
keyWrapAlgorithm: "RSA",
}),
fs.createWriteStream("ciphertext.out"),
);
```
{{% /codetab %}}
{{% codetab %}}
<!--go-->
Using the Dapr SDK in your project, you can encrypt a stream of data, such as a file.
```go
out, err := sdkClient.Encrypt(context.Background(), rf, dapr.EncryptOptions{
// Name of the Dapr component (required)
@ -35,18 +75,8 @@ out, err := sdkClient.Encrypt(context.Background(), rf, dapr.EncryptOptions{
})
```
{{% /codetab %}}
{{< /tabs >}}
The following example puts the `Encrypt` API in context, with code that reads the file, encrypts it, then stores the result in another file.
{{< tabs "Go" >}}
{{% codetab %}}
<!--go-->
```go
// Input file, clear-text
rf, err := os.Open("input")
@ -81,18 +111,8 @@ if err != nil {
fmt.Println("Written", n, "bytes")
```
{{% /codetab %}}
{{< /tabs >}}
The following example uses the `Encrypt` API to encrypt a string.
{{< tabs "Go" >}}
{{% codetab %}}
<!--go-->
```go
// Input string
rf := strings.NewReader("Amor, ch'a nullo amato amar perdona, mi prese del costui piacer sì forte, che, come vedi, ancor non m'abbandona")
@ -121,15 +141,41 @@ if err != nil {
## Decrypt
To decrypt a file, add the `Decrypt` gRPC API to your project.
{{< tabs "JavaScript" "Go" >}}
{{< tabs "Go" >}}
{{% codetab %}}
<!--JavaScript-->
Using the Dapr SDK, you can decrypt data in a buffer or using streams.
```js
// When passing data as a buffer, `decrypt` returns a Buffer with the decrypted message
const plaintext = await client.crypto.decrypt(ciphertext, {
  // Only required option is the component name
  componentName: "mycryptocomponent",
});

// `decrypt` can also be used as a Duplex stream
await pipeline(
  fs.createReadStream("ciphertext.out"),
  await client.crypto.decrypt({
    // Only required option is the component name
    componentName: "mycryptocomponent",
  }),
  fs.createWriteStream("plaintext.out"),
);
```
{{% /codetab %}}
{{% codetab %}}
<!--go-->
To decrypt a file, use the `Decrypt` gRPC API in your project.

In the following example, `out` is a stream that can be written to file or read in memory, as in the examples above.
```go
out, err := sdkClient.Decrypt(context.Background(), rf, dapr.DecryptOptions{

@ -186,7 +186,7 @@ Place `subscription.yaml` in the same directory as your `pubsub.yaml` component.
Below are code examples that leverage Dapr SDKs to subscribe to the topic you defined in `subscription.yaml`.
{{< tabs Dotnet Java Python Go JavaScript>}}
{{% codetab %}}

@ -30,10 +30,12 @@ The Dapr sidecar doesn't load any workflow definitions. Rather, the sidecar si
[Workflow activities]({{< ref "workflow-features-concepts.md#workflow-activities" >}}) are the basic unit of work in a workflow and are the tasks that get orchestrated in the business process.
{{< tabs ".NET" >}}
{{< tabs ".NET" Python >}}
{{% codetab %}}
<!--csharp-->
Define the workflow activities you'd like your workflow to perform. Activities are a class definition and can take inputs and outputs. Activities also participate in dependency injection, like binding to a Dapr client.
The activities called in the example below are:
@ -96,6 +98,24 @@ public class ProcessPaymentActivity : WorkflowActivity<PaymentRequest, object>
[See the full `ProcessPaymentActivity.cs` workflow activity example.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Activities/ProcessPaymentActivity.cs)
{{% /codetab %}}
{{% codetab %}}
<!--python-->
Define the workflow activities you'd like your workflow to perform. Activities are a function definition and can take inputs and outputs. The following example creates a counter (activity) called `hello_act` that notifies users of the current counter value. `hello_act` is a function that takes a `WorkflowActivityContext` instance as its context parameter.
```python
def hello_act(ctx: WorkflowActivityContext, input):
    global counter
    counter += input
    print(f'New counter value is: {counter}!', flush=True)
```
[See the `hello_act` workflow activity in context.](https://github.com/dapr/python-sdk/blob/master/examples/demo_workflow/app.py#LL40C1-L43C59)
{{% /codetab %}}
{{< /tabs >}}
@ -104,10 +124,12 @@ public class ProcessPaymentActivity : WorkflowActivity<PaymentRequest, object>
Next, register and call the activities in a workflow.
{{< tabs ".NET" >}}
{{< tabs ".NET" Python >}}
{{% codetab %}}
<!--csharp-->
The `OrderProcessingWorkflow` class is derived from a base class called `Workflow` with input and output parameter types. It also includes a `RunAsync` method that does the heavy lifting of the workflow and calls the workflow activities.
```csharp
@ -144,6 +166,28 @@ The `OrderProcessingWorkflow` class is derived from a base class called `Workflo
[See the full workflow example in `OrderProcessingWorkflow.cs`.](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Workflows/OrderProcessingWorkflow.cs)
{{% /codetab %}}
{{% codetab %}}
<!--python-->
The `hello_world_wf` function takes a `DaprWorkflowContext` instance as its context parameter, along with the workflow input. It uses `yield` statements that do the heavy lifting of the workflow and call the workflow activities.
```python
def hello_world_wf(ctx: DaprWorkflowContext, input):
    print(f'{input}')
    yield ctx.call_activity(hello_act, input=1)
    yield ctx.call_activity(hello_act, input=10)
    yield ctx.wait_for_external_event("event1")
    yield ctx.call_activity(hello_act, input=100)
    yield ctx.call_activity(hello_act, input=1000)
```
[See the `hello_world_wf` workflow in context.](https://github.com/dapr/python-sdk/blob/master/examples/demo_workflow/app.py#LL32C1-L38C51)
{{% /codetab %}}
{{< /tabs >}}
@ -152,10 +196,12 @@ The `OrderProcessingWorkflow` class is derived from a base class called `Workflo
Finally, compose the application using the workflow.
{{< tabs ".NET" >}}
{{< tabs ".NET" Python >}}
{{% codetab %}}
<!--csharp-->
[In the following `Program.cs` example](https://github.com/dapr/dotnet-sdk/blob/master/examples/Workflow/WorkflowConsoleApp/Program.cs), for a basic ASP.NET order processing application using the .NET SDK, your project code would include:
- A NuGet package called `Dapr.Workflow` to receive the .NET SDK capabilities
@ -223,8 +269,97 @@ app.Run();
{{% /codetab %}}

{{% codetab %}}
<!--python-->
[In the following example](https://github.com/dapr/python-sdk/blob/master/examples/demo_workflow/app.py), for a basic Python hello world application using the Python SDK, your project code would include:
- The `DaprClient` class, imported from the `dapr` Python package, to receive the Python SDK capabilities.
- A builder with extensions called:
- `WorkflowRuntime`: Allows you to register workflows and workflow activities
- `DaprWorkflowContext`: Allows you to [create workflows]({{< ref "#write-the-workflow" >}})
- `WorkflowActivityContext`: Allows you to [create workflow activities]({{< ref "#write-the-workflow-activities" >}})
- API calls. In the example below, these calls start, pause, resume, purge, and terminate the workflow.
```python
from dapr.ext.workflow import WorkflowRuntime, DaprWorkflowContext, WorkflowActivityContext
from dapr.clients import DaprClient
# ...
def main():
    with DaprClient() as d:
        host = settings.DAPR_RUNTIME_HOST
        port = settings.DAPR_GRPC_PORT
        workflowRuntime = WorkflowRuntime(host, port)
        workflowRuntime.register_workflow(hello_world_wf)
        workflowRuntime.register_activity(hello_act)
        workflowRuntime.start()

        # Start workflow
        print("==========Start Counter Increase as per Input:==========")
        start_resp = d.start_workflow(instance_id=instanceId, workflow_component=workflowComponent,
                                      workflow_name=workflowName, input=inputData, workflow_options=workflowOptions)
        print(f"start_resp {start_resp.instance_id}")

        # ...

        # Pause workflow
        d.pause_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        print(f"Get response from {workflowName} after pause call: {getResponse.runtime_status}")

        # Resume workflow
        d.resume_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        print(f"Get response from {workflowName} after resume call: {getResponse.runtime_status}")

        sleep(1)

        # Raise workflow
        d.raise_workflow_event(instance_id=instanceId, workflow_component=workflowComponent,
                               event_name=eventName, event_data=eventData)

        sleep(5)

        # Purge workflow
        d.purge_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        try:
            getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        except DaprInternalError as err:
            if nonExistentIDError in err._message:
                print("Instance Successfully Purged")

        # Kick off another workflow for termination purposes
        start_resp = d.start_workflow(instance_id=instanceId, workflow_component=workflowComponent,
                                      workflow_name=workflowName, input=inputData, workflow_options=workflowOptions)
        print(f"start_resp {start_resp.instance_id}")

        # Terminate workflow
        d.terminate_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        sleep(1)
        getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        print(f"Get response from {workflowName} after terminate call: {getResponse.runtime_status}")

        # Purge workflow
        d.purge_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        try:
            getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
        except DaprInternalError as err:
            if nonExistentIDError in err._message:
                print("Instance Successfully Purged")

        workflowRuntime.shutdown()

if __name__ == '__main__':
    main()
```
{{% /codetab %}}
{{< /tabs >}}
{{% alert title="Important" color="warning" %}}
@ -241,4 +376,6 @@ Now that you've authored a workflow, learn how to manage it.
## Related links
- [Workflow overview]({{< ref workflow-overview.md >}})
- [Workflow API reference]({{< ref workflow_api.md >}})
- Try out the full SDK examples:
  - [.NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)
  - [Python example](https://github.com/dapr/python-sdk/tree/master/examples/demo_workflow)

View File

@ -8,12 +8,12 @@ description: Manage and run workflows
Now that you've [authored the workflow and its activities in your application]({{< ref howto-author-workflow.md >}}), you can start, terminate, and get information about the workflow using HTTP API calls. For more information, read the [workflow API reference]({{< ref workflow_api.md >}}).
{{< tabs ".NET SDK" HTTP >}}
{{< tabs ".NET" Python HTTP >}}
<!--NET-->
{{% codetab %}}
Manage your workflow within your code. In the `OrderProcessingWorkflow` example from the [Author a workflow]({{< ref "howto-author-workflow.md#write-the-application" >}}) guide, the workflow is registered in the code. You can now start, terminate, and get information about a running workflow:
```csharp
string orderId = "exampleOrderId";
@ -46,6 +46,56 @@ await daprClient.PurgeWorkflowAsync(orderId, workflowComponent);
{{% /codetab %}}
<!--Python-->
{{% codetab %}}
Manage your workflow within your code. In the workflow example from the [Author a workflow]({{< ref "howto-author-workflow.md#write-the-application" >}}) guide, the workflow is registered in the code using the following APIs:
- **start_workflow**: Start an instance of a workflow
- **get_workflow**: Get information on the status of the workflow
- **pause_workflow**: Pauses or suspends a workflow instance that can later be resumed
- **resume_workflow**: Resumes a paused workflow instance
- **raise_workflow_event**: Raise an event on a workflow
- **purge_workflow**: Removes all metadata related to a specific workflow instance
- **terminate_workflow**: Terminate or stop a particular instance of a workflow
```python
from dapr.ext.workflow import WorkflowRuntime, DaprWorkflowContext, WorkflowActivityContext
from dapr.clients import DaprClient
# Sane parameters
instanceId = "exampleInstanceID"
workflowComponent = "dapr"
workflowName = "hello_world_wf"
eventName = "event1"
eventData = "eventData"
# Start the workflow
start_resp = d.start_workflow(instance_id=instanceId, workflow_component=workflowComponent,
                              workflow_name=workflowName, input=inputData, workflow_options=workflowOptions)
# Get info on the workflow
getResponse = d.get_workflow(instance_id=instanceId, workflow_component=workflowComponent)
# Pause the workflow
d.pause_workflow(instance_id=instanceId, workflow_component=workflowComponent)
# Resume the workflow
d.resume_workflow(instance_id=instanceId, workflow_component=workflowComponent)
# Raise an event on the workflow.
d.raise_workflow_event(instance_id=instanceId, workflow_component=workflowComponent,
                       event_name=eventName, event_data=eventData)
# Purge the workflow
d.purge_workflow(instance_id=instanceId, workflow_component=workflowComponent)
# Terminate the workflow
d.terminate_workflow(instance_id=instanceId, workflow_component=workflowComponent)
```
{{% /codetab %}}
<!--HTTP-->
{{% codetab %}}
@ -121,5 +171,7 @@ Learn more about these HTTP calls in the [workflow API reference guide]({{< ref
## Next steps
- [Try out the Workflow quickstart]({{< ref workflow-quickstart.md >}})
- Try out the full SDK examples:
  - [.NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)
  - [Python example](https://github.com/dapr/python-sdk/blob/master/examples/demo_workflow/app.py)
- [Workflow API reference]({{< ref workflow_api.md >}})

@ -14,7 +14,7 @@ For more information on how workflow state is managed, see the [workflow archite
## Workflows
Dapr Workflows are functions you write that define a series of tasks to be executed in a particular order. The Dapr Workflow engine takes care of scheduling and execution of the tasks, including managing failures and retries. If the app hosting your workflows is scaled out across multiple machines, the workflow engine may also load balance the execution of workflows and their tasks across multiple machines.

There are several different kinds of tasks that a workflow can schedule, including:
- [Activities]({{< ref "workflow-features-concepts.md#workflow-activities" >}}) for executing custom logic
@ -24,7 +24,7 @@ There are several different kinds of tasks that a workflow can schedule, includi
### Workflow identity
Each workflow you define has a type name, and individual executions of a workflow require a unique _instance ID_. Workflow instance IDs can be generated by your app code, which is useful when workflows correspond to business entities like documents or jobs, or can be auto-generated UUIDs. A workflow's instance ID is useful for debugging and also for managing workflows using the [Workflow APIs]({{< ref workflow_api.md >}}).
Only one workflow instance with a given ID can exist at any given time. However, if a workflow instance completes or fails, its ID can be reused by a new workflow instance. Note, however, that the new workflow instance effectively replaces the old one in the configured state store.
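For illustration, here is a minimal Python sketch of starting a workflow with an app-generated instance ID, reusing the `start_workflow` and `get_workflow` APIs and the `dapr`/`hello_world_wf` names from the examples elsewhere in these docs (the input value is an assumption):

```python
import uuid
from dapr.clients import DaprClient

# App-generated instance ID, e.g. tied to a business entity like a document or job
instanceId = f"job-{uuid.uuid4()}"

with DaprClient() as d:
    d.start_workflow(instance_id=instanceId, workflow_component="dapr",
                     workflow_name="hello_world_wf", input="some input")
    # The same ID is later used to query or manage this instance
    state = d.get_workflow(instance_id=instanceId, workflow_component="dapr")
    print(state.runtime_status)
```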
@ -36,8 +36,8 @@ When a workflow "awaits" a scheduled task, it unloads itself from memory until t
When a workflow function is replayed, it runs again from the beginning. However, when it encounters a task that already completed, instead of scheduling that task again, the workflow engine:
1. Returns the stored result of the completed task to the workflow.
1. Continues execution until the next "await" point.
This "replay" behavior continues until the workflow function completes or fails with an error.

@ -10,32 +10,32 @@ description: "Overview of Dapr Workflow"
Dapr Workflow is currently in alpha.
{{% /alert %}}
Dapr workflow makes it easy for developers to write business logic and integrations in a reliable way. Since Dapr workflows are stateful, they support long-running and fault-tolerant applications, ideal for orchestrating microservices. Dapr workflow works seamlessly with other Dapr building blocks, such as service invocation, pub/sub, state management, and bindings.
The durable, resilient Dapr Workflow capability:
- Offers a built-in workflow runtime for driving Dapr Workflow execution.
- Provides SDKs for authoring workflows in code, using any language.
- Provides HTTP and gRPC APIs for managing workflows (start, query, suspend/resume, terminate).
- Integrates with any other workflow runtime via workflow components.
<img src="/images/workflow-overview/workflow-overview.png" width=800 alt="Diagram showing basics of Dapr Workflow">
Some example scenarios that Dapr Workflow can perform are:
- Order processing involving orchestration between inventory management, payment systems, and shipping services.
- HR onboarding workflows coordinating tasks across multiple departments and participants.
- Orchestrating the roll-out of digital menu updates in a national restaurant chain.
- Image processing workflows involving API-based classification and storage.
## Features
### Workflows and activities
With Dapr Workflow, you can write activities and then orchestrate those activities in a workflow. Workflow activities are:
- The basic unit of work in a workflow
- The tasks that get orchestrated in the business process
- Used for calling other (Dapr) services, and for interacting with state stores and pub/sub brokers.
[Learn more about workflow activities.]({{< ref "workflow-features-concepts.md#workflow-activities" >}})
@ -47,7 +47,7 @@ In addition to activities, you can write workflows to schedule other workflows a
### Timers and reminders
As with Dapr actors, you can schedule reminder-like durable delays for any time range.
[Learn more about workflow timers]({{< ref "workflow-features-concepts.md#durable-timers" >}}) and [reminders]({{< ref "workflow-architecture.md#reminder-usage-and-execution-guarantees" >}})
@ -81,6 +81,8 @@ You can use the following SDKs to author a workflow.
| Language stack | Package |
| - | - |
| .NET | [Dapr.Workflow](https://www.nuget.org/profiles/dapr.io) |
| Python | [dapr-ext-workflow](https://github.com/dapr/python-sdk/tree/master/ext/dapr-ext-workflow) |
## Try out workflows
@ -92,6 +94,8 @@ Want to put workflows to the test? Walk through the following quickstart and tut
| ------------------- | ----------- |
| [Workflow quickstart]({{< ref workflow-quickstart.md >}}) | Run a .NET workflow application with four workflow activities to see Dapr Workflow in action |
| [Workflow .NET SDK example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow) | Learn how to create a Dapr Workflow and invoke it using ASP.NET Core web APIs. |
| [Workflow Python SDK example](https://github.com/dapr/python-sdk/tree/master/examples/demo_workflow) | Learn how to create a Dapr Workflow and invoke it using the Python `DaprClient` package. |
### Start using workflows directly in your app
@ -110,4 +114,6 @@ Watch [this video for an overview on Dapr Workflow](https://youtu.be/s1p9MNl4VGo
## Related links
- [Workflow API reference]({{< ref workflow_api.md >}})
- Try out the full SDK examples:
  - [.NET example](https://github.com/dapr/dotnet-sdk/tree/master/examples/Workflow)
  - [Python example](https://github.com/dapr/python-sdk/tree/master/examples/demo_workflow)

@ -12,10 +12,10 @@ Dapr Workflows simplify complex, stateful coordination requirements in microserv
In the task chaining pattern, multiple steps in a workflow are run in succession, and the output of one step may be passed as the input to the next step. Task chaining workflows typically involve creating a sequence of operations that need to be performed on some data, such as filtering, transforming, and reducing.
<img src="/images/workflow-overview/workflows-chaining.png" width=800 alt="Diagram showing how the task chaining workflow pattern works">

In some cases, the steps of the workflow may need to be orchestrated across multiple microservices. For increased reliability and scalability, you're also likely to use queues to trigger the various steps.
While the pattern is simple, there are many complexities hidden in the implementation. For example:
- What happens if one of the microservices is unavailable for an extended period of time?

@ -64,8 +64,9 @@ cd ./crypto-quickstart
```
The application code defines two required keys:
- A private RSA key
- A 256-bit symmetric (AES) key
Generate two keys, an RSA key and an AES key, using OpenSSL and write them to two files:
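As a sketch, the keys can be generated with OpenSSL along these lines (the output file names are assumptions for illustration; use the names the application expects):

```bash
mkdir -p keys
# A private RSA key (4096-bit, PEM-encoded)
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:4096 -out keys/rsa-private-key.pem
# A 256-bit (32-byte) symmetric key for AES
openssl rand -out keys/symmetric-key-256 32
```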

@ -28,7 +28,7 @@ In this guide, you'll:
Currently, you can experience Dapr Workflow using the .NET or Python SDK.

{{< tabs ".NET" "Python" >}}
<!-- .NET -->
{{% codetab %}}
@ -254,8 +254,234 @@ The `Activities` directory holds the four workflow activities used by the workfl
- `ProcessPaymentActivity.cs`
- `UpdateInventoryActivity.cs`
{{% /codetab %}}
<!-- Python -->
{{% codetab %}}
### Step 1: Pre-requisites
For this example, you will need:
- [Dapr CLI and initialized environment](https://docs.dapr.io/getting-started).
- [Python 3.7+ installed](https://www.python.org/downloads/).
<!-- IGNORE_LINKS -->
- [Docker Desktop](https://www.docker.com/products/docker-desktop)
<!-- END_IGNORE -->
### Step 2: Set up the environment
Clone the [sample provided in the Quickstarts repo](https://github.com/dapr/quickstarts/tree/master/workflows).
```bash
git clone https://github.com/dapr/quickstarts.git
```
In a new terminal window, navigate to the `order-processor` directory:
```bash
cd workflows/python/sdk/order-processor
```
Install the Dapr Python SDK package:
```bash
pip3 install -r requirements.txt
```
### Step 3: Run the order processor app
In the terminal, start the order processor app alongside a Dapr sidecar:
```bash
dapr run --app-id order-processor --resources-path ../../../components/ -- python3 app.py
```
> **Note:** Since the `python3` command is not defined on Windows, you may need to use `python app.py` instead of `python3 app.py`.
This starts the `order-processor` app with a unique workflow ID and runs the workflow activities.
Expected output:
```bash
== APP == Starting order workflow, purchasing 10 of cars
== APP == 2023-06-06 09:35:52.945 durabletask-worker INFO: Successfully connected to 127.0.0.1:65406. Waiting for work items...
== APP == INFO:NotifyActivity:Received order f4e1926e-3721-478d-be8a-f5bebd1995da for 10 cars at $150000 !
== APP == INFO:VerifyInventoryActivity:Verifying inventory for order f4e1926e-3721-478d-be8a-f5bebd1995da of 10 cars
== APP == INFO:VerifyInventoryActivity:There are 100 Cars available for purchase
== APP == INFO:RequestApprovalActivity:Requesting approval for payment of 165000 USD for 10 cars
== APP == 2023-06-06 09:36:05.969 durabletask-worker INFO: f4e1926e-3721-478d-be8a-f5bebd1995da Event raised: manager_approval
== APP == INFO:NotifyActivity:Payment for order f4e1926e-3721-478d-be8a-f5bebd1995da has been approved!
== APP == INFO:ProcessPaymentActivity:Processing payment: f4e1926e-3721-478d-be8a-f5bebd1995da for 10 cars at 150000 USD
== APP == INFO:ProcessPaymentActivity:Payment for request ID f4e1926e-3721-478d-be8a-f5bebd1995da processed successfully
== APP == INFO:UpdateInventoryActivity:Checking inventory for order f4e1926e-3721-478d-be8a-f5bebd1995da for 10 cars
== APP == INFO:UpdateInventoryActivity:There are now 90 cars left in stock
== APP == INFO:NotifyActivity:Order f4e1926e-3721-478d-be8a-f5bebd1995da has completed!
== APP == 2023-06-06 09:36:06.106 durabletask-worker INFO: f4e1926e-3721-478d-be8a-f5bebd1995da: Orchestration completed with status: COMPLETED
== APP == Workflow completed! Result: Completed
== APP == Purchase of item is Completed
```
### (Optional) Step 4: View in Zipkin
If you have Zipkin configured for Dapr locally on your machine, you can view the workflow trace spans in the Zipkin web UI (typically at `http://localhost:9411/zipkin/`).
<img src="/images/workflow-trace-spans-zipkin-python.png" width=900 style="padding-bottom:15px;">
### What happened?
When you ran `dapr run --app-id order-processor --resources-path ../../../components/ -- python3 app.py`:
1. A unique order ID for the workflow is generated (in the above example, `f4e1926e-3721-478d-be8a-f5bebd1995da`) and the workflow is scheduled.
1. The `NotifyActivity` workflow activity sends a notification saying an order for 10 cars has been received.
1. The `VerifyInventoryActivity` workflow activity checks the inventory data, determines if you can supply the ordered item, and responds with the number of cars in stock.
1. Your workflow starts and notifies you of its status.
1. The `ProcessPaymentActivity` workflow activity begins processing payment for order `f4e1926e-3721-478d-be8a-f5bebd1995da` and confirms if successful.
1. The `UpdateInventoryActivity` workflow activity updates the inventory with the current available cars after the order has been processed.
1. The `NotifyActivity` workflow activity sends a notification saying that order `f4e1926e-3721-478d-be8a-f5bebd1995da` has completed.
1. The workflow terminates as completed.
#### `order-processor/app.py`
In the application's program file:
- The unique workflow order ID is generated
- The workflow is scheduled
- The workflow status is retrieved
- The workflow and the workflow activities it invokes are registered
```python
class WorkflowConsoleApp:
    def main(self):
        # Register workflow and activities
        workflowRuntime = WorkflowRuntime(settings.DAPR_RUNTIME_HOST, settings.DAPR_GRPC_PORT)
        workflowRuntime.register_workflow(order_processing_workflow)
        workflowRuntime.register_activity(notify_activity)
        workflowRuntime.register_activity(requst_approval_activity)
        workflowRuntime.register_activity(verify_inventory_activity)
        workflowRuntime.register_activity(process_payment_activity)
        workflowRuntime.register_activity(update_inventory_activity)
        workflowRuntime.start()

        print("==========Begin the purchase of item:==========", flush=True)
        item_name = default_item_name
        order_quantity = 10

        total_cost = int(order_quantity) * baseInventory[item_name].per_item_cost
        order = OrderPayload(item_name=item_name, quantity=int(order_quantity), total_cost=total_cost)

        # Start Workflow
        print(f'Starting order workflow, purchasing {order_quantity} of {item_name}', flush=True)
        start_resp = daprClient.start_workflow(workflow_component=workflow_component,
                                               workflow_name=workflow_name,
                                               input=order)
        _id = start_resp.instance_id

        def prompt_for_approval(daprClient: DaprClient):
            daprClient.raise_workflow_event(instance_id=_id, workflow_component=workflow_component,
                                            event_name="manager_approval", event_data={'approval': True})

        approval_seeked = False
        start_time = datetime.now()
        while True:
            time_delta = datetime.now() - start_time
            state = daprClient.get_workflow(instance_id=_id, workflow_component=workflow_component)
            if not state:
                print("Workflow not found!")  # not expected
            elif state.runtime_status == "Completed" or\
                    state.runtime_status == "Failed" or\
                    state.runtime_status == "Terminated":
                print(f'Workflow completed! Result: {state.runtime_status}', flush=True)
                break
            if time_delta.total_seconds() >= 10:
                state = daprClient.get_workflow(instance_id=_id, workflow_component=workflow_component)
                if total_cost > 50000 and (
                    state.runtime_status != "Completed" or
                    state.runtime_status != "Failed" or
                    state.runtime_status != "Terminated"
                ) and not approval_seeked:
                    approval_seeked = True
                    threading.Thread(target=prompt_for_approval(daprClient), daemon=True).start()

        print("Purchase of item is ", state.runtime_status, flush=True)

    def restock_inventory(self, daprClient: DaprClient, baseInventory):
        for key, item in baseInventory.items():
            print(f'item: {item}')
            item_str = f'{{"name": "{item.item_name}", "quantity": {item.quantity},\
                "per_item_cost": {item.per_item_cost}}}'
            daprClient.save_state("statestore-actors", key, item_str)

if __name__ == '__main__':
    app = WorkflowConsoleApp()
    app.main()
```
#### `order-processor/workflow.py`
In `workflow.py`, the workflow is defined as a function with all of its associated tasks (determined by workflow activities).
```python
def order_processing_workflow(ctx: DaprWorkflowContext, order_payload_str: OrderPayload):
    """Defines the order processing workflow.

    When the order is received, the inventory is checked to see if there is enough inventory to
    fulfill the order. If there is enough inventory, the payment is processed and the inventory is
    updated. If there is not enough inventory, the order is rejected.
    If the total order is greater than $50,000, the order is sent to a manager for approval.
    """
    order_id = ctx.instance_id
    order_payload = json.loads(order_payload_str)

    yield ctx.call_activity(notify_activity,
                            input=Notification(message=('Received order ' + order_id + ' for '
                                               + f'{order_payload["quantity"]}' + ' ' + f'{order_payload["item_name"]}'
                                               + ' at $' + f'{order_payload["total_cost"]}' + ' !')))
    result = yield ctx.call_activity(verify_inventory_activity,
                                     input=InventoryRequest(request_id=order_id,
                                                            item_name=order_payload["item_name"],
                                                            quantity=order_payload["quantity"]))
    if not result.success:
        yield ctx.call_activity(notify_activity,
                                input=Notification(message='Insufficient inventory for '
                                                   + f'{order_payload["item_name"]}' + '!'))
        return OrderResult(processed=False)

    if order_payload["total_cost"] > 50000:
        yield ctx.call_activity(requst_approval_activity, input=order_payload)
        approval_task = ctx.wait_for_external_event("manager_approval")
        timeout_event = ctx.create_timer(timedelta(seconds=200))
        winner = yield when_any([approval_task, timeout_event])
        if winner == timeout_event:
            yield ctx.call_activity(notify_activity,
                                    input=Notification(message='Payment for order ' + order_id
                                                       + ' has been cancelled due to timeout!'))
            return OrderResult(processed=False)
        approval_result = yield approval_task
        if approval_result["approval"]:
            yield ctx.call_activity(notify_activity, input=Notification(
                message=f'Payment for order {order_id} has been approved!'))
        else:
            yield ctx.call_activity(notify_activity, input=Notification(
                message=f'Payment for order {order_id} has been rejected!'))
            return OrderResult(processed=False)

    yield ctx.call_activity(process_payment_activity, input=PaymentRequest(
        request_id=order_id, item_being_purchased=order_payload["item_name"],
        amount=order_payload["total_cost"], quantity=order_payload["quantity"]))

    try:
        yield ctx.call_activity(update_inventory_activity,
                                input=PaymentRequest(request_id=order_id,
                                                     item_being_purchased=order_payload["item_name"],
                                                     amount=order_payload["total_cost"],
                                                     quantity=order_payload["quantity"]))
    except Exception:
        yield ctx.call_activity(notify_activity,
                                input=Notification(message=f'Order {order_id} Failed!'))
        return OrderResult(processed=False)

    yield ctx.call_activity(notify_activity, input=Notification(
        message=f'Order {order_id} has completed!'))
    return OrderResult(processed=True)
```
{{% /codetab %}}

@ -6,14 +6,18 @@ weight: 4500
description: "Choose which Dapr sidecar APIs are available to the app"
---
In certain scenarios, such as zero trust networks or when exposing the Dapr sidecar to external traffic through a frontend, it's recommended to only enable the Dapr sidecar APIs that are being used by the app. Doing so reduces the attack surface and helps keep the Dapr APIs scoped to the actual needs of the application.

Dapr allows developers to control which APIs are accessible to the application by setting an API allowlist or denylist using a [Dapr Configuration]({{<ref "configuration-overview.md">}}).
### Default behavior
If no API allowlist or denylist is specified, the default behavior is to allow access to all Dapr APIs.
- If only a denylist is defined, all Dapr APIs are allowed except those defined in the denylist
- If only an allowlist is defined, only the Dapr APIs listed in the allowlist are allowed
- If both an allowlist and a denylist are defined, the allowed APIs are those defined in the allowlist, unless they are also included in the denylist. In other words, the denylist overrides the allowlist for APIs that are defined in both (see the combined example below)
- If neither is defined, all APIs are allowed.
For example, the following configuration enables all APIs for both HTTP and gRPC:
@ -28,9 +32,11 @@ spec:
samplingRate: "1"
```
### Using an allowlist

#### Enabling specific HTTP APIs

The following example enables the state `v1.0` HTTP API and blocks all other HTTP APIs:
```yaml
apiVersion: dapr.io/v1alpha1
@ -41,14 +47,14 @@ metadata:
spec:
  api:
    allowed:
      - name: state
        version: v1.0
        protocol: http
```
#### Enabling specific gRPC APIs

The following example enables the state `v1` gRPC API and blocks all other gRPC APIs:
```yaml
apiVersion: dapr.io/v1alpha1
@ -59,9 +65,47 @@ metadata:
spec:
  api:
    allowed:
      - name: state
        version: v1
        protocol: grpc
```
### Using a denylist
#### Disabling specific HTTP APIs
The following example disables the state `v1.0` HTTP API, allowing all other HTTP APIs:
```yaml
apiVersion: dapr.io/v1alpha1
kind: Configuration
metadata:
  name: myappconfig
  namespace: default
spec:
  api:
    denied:
      - name: state
        version: v1.0
        protocol: http
```
#### Disabling specific gRPC APIs
The following example disables the state `v1` gRPC API, allowing all other gRPC APIs:
```yaml
apiVersion: dapr.io/v1alpha1
kind: Configuration
metadata:
  name: myappconfig
  namespace: default
spec:
  api:
    denied:
      - name: state
        version: v1
        protocol: grpc
```
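### Combining an allowlist and a denylist

The following sketch (an illustrative configuration, not taken from the examples above) shows the precedence rule described earlier: the allowlist enables the state and publish `v1.0` HTTP APIs, and the denylist then removes publish, leaving only the state API accessible:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Configuration
metadata:
  name: myappconfig
  namespace: default
spec:
  api:
    allowed:
      - name: state
        version: v1.0
        protocol: http
      - name: publish
        version: v1.0
        protocol: http
    denied:
      # Overrides the allowlist entry above, so only state remains enabled
      - name: publish
        version: v1.0
        protocol: http
```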
### List of Dapr APIs
@ -70,12 +114,18 @@ The `name` field takes the name of the Dapr API you would like to enable.
See this list of values corresponding to the different Dapr APIs:
| API group | HTTP API | [gRPC API](https://github.com/dapr/dapr/blob/master/pkg/grpc/endpoints.go) |
| ----- | ----- | ----- |
| [Service Invocation]({{< ref service_invocation_api.md >}}) | `invoke` (`v1.0`) | `invoke` (`v1`) |
| [State]({{< ref state_api.md>}})| `state` (`v1.0` and `v1.0-alpha1`) | `state` (`v1` and `v1alpha1`) |
| [Pub/Sub]({{< ref pubsub.md >}}) | `publish` (`v1.0` and `v1.0-alpha1`) | `publish` (`v1` and `v1alpha1`) |
| [(Output) Bindings]({{< ref bindings_api.md >}}) | `bindings` (`v1.0`) |`bindings` (`v1`) |
| [Secrets]({{< ref secrets_api.md >}})| `secrets` (`v1.0`) | `secrets` (`v1`) |
| [Actors]({{< ref actors_api.md >}}) | `actors` (`v1.0`) |`actors` (`v1`) |
| [Metadata]({{< ref metadata_api.md >}}) | `metadata` (`v1.0`) |`metadata` (`v1`) |
| [Configuration]({{< ref configuration_api.md >}}) | `configuration` (`v1.0` and `v1.0-alpha1`) | `configuration` (`v1` and `v1alpha1`) |
| [Distributed Lock]({{< ref distributed_lock_api.md >}}) | `lock` (`v1.0-alpha1`)<br/>`unlock` (`v1.0-alpha1`) | `lock` (`v1alpha1`)<br/>`unlock` (`v1alpha1`) |
| Cryptography | `crypto` (`v1.0-alpha1`) | `crypto` (`v1alpha1`) |
| [Workflow]({{< ref workflow_api.md >}}) | `workflows` (`v1.0-alpha1`) |`workflows` (`v1alpha1`) |
| [Health]({{< ref health_api.md >}}) | `healthz` (`v1.0`) | n/a |
| Shutdown | `shutdown` (`v1.0`) | `shutdown` (`v1`) |

@ -58,6 +58,18 @@ dapr init -k
✅ Success! Dapr has been installed to namespace dapr-system. To verify, run "dapr status -k" in your terminal. To get started, go here: https://aka.ms/dapr-getting-started
```
To run the dashboard, run:
```bash
dapr dashboard -k
```
If you installed Dapr in a non-default namespace, run:
```bash
dapr dashboard -k -n <your-namespace>
```
### Install Dapr from a private Dapr Helm chart
There are some scenarios where it's necessary to install Dapr from a private Helm chart, such as:
- needing more granular control of the Dapr Helm chart
@ -125,7 +137,7 @@ The latest Dapr helm chart no longer supports Helm v2. Please migrate from Helm
### Add and install Dapr Helm chart
1. Make sure [Helm 3](https://github.com/helm/helm/releases) is installed on your machine
1. Add Helm repo and update
```bash
# Add the official Dapr Helm chart.
@ -134,10 +146,11 @@ The latest Dapr helm chart no longer supports Helm v2. Please migrate from Helm
helm repo add dapr http://helm.custom-domain.com/dapr/dapr/ \
--username=xxx --password=xxx
helm repo update
# See which chart versions are available
helm search repo dapr --devel --versions
```
1. Install the Dapr chart on your cluster in the `dapr-system` namespace.
```bash
helm upgrade --install dapr dapr/dapr \
@ -158,8 +171,7 @@ The latest Dapr helm chart no longer supports Helm v2. Please migrate from Helm
--wait
```
See [Guidelines for production ready deployments on Kubernetes]({{< ref kubernetes-production.md >}}) for more information on installing and upgrading Dapr using Helm.
### Uninstall Dapr on Kubernetes
@ -172,6 +184,22 @@ helm uninstall dapr --namespace dapr-system
- Read [this guide]({{< ref kubernetes-production.md >}}) for recommended Helm chart values for production setups
- See [this page](https://github.com/dapr/dapr/blob/master/charts/dapr/README.md) for details on Dapr Helm charts.
## Installing the Dapr dashboard as part of the control plane
If you want to install the Dapr dashboard, use this Helm chart with the additional settings of your choice:
`helm install dapr dapr/dapr-dashboard --namespace dapr-system`
For example:
```bash
helm repo add dapr https://dapr.github.io/helm-charts/
helm repo update
kubectl create namespace dapr-system
# Install the Dapr dashboard
helm install dapr dapr/dapr-dashboard --namespace dapr-system
```
## Verify installation
Once the installation is complete, verify that the dapr-operator, dapr-placement, dapr-sidecar-injector and dapr-sentry pods are running in the `dapr-system` namespace:

@ -75,4 +75,4 @@ By default, tailing is set to /var/log/containers/*.log. To change this setting,
* [Telemetry Data Platform](https://newrelic.com/platform/telemetry-data-platform)
* [New Relic Logging](https://github.com/newrelic/helm-charts/tree/master/charts/newrelic-logging)
* [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)

@ -40,4 +40,4 @@ This document explains how to install it in your cluster, either using a Helm ch
* [Telemetry Data Platform](https://newrelic.com/platform/telemetry-data-platform)
* [New Relic Prometheus OpenMetrics Integration](https://github.com/newrelic/helm-charts/tree/master/charts/nri-prometheus)
* [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/learn-alerts/alerts-ai-transition-guide-2022/)
* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)

@ -101,7 +101,7 @@ And the exact same dashboard templates from Dapr can be imported to visualize Da
## New Relic Alerts
All the data that is collected from Dapr, Kubernetes, or any services that run on top of them can be used to set up alerts and notifications in the preferred channel of your choice. See [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/).
## Related Links/References
@ -111,4 +111,4 @@ All the data that is collected from Dapr, Kubernetes or any services that run on
* [New Relic Trace API](https://docs.newrelic.com/docs/distributed-tracing/trace-api/introduction-trace-api/)
* [Types of New Relic API keys](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)
* [New Relic OpenTelemetry User Experience](https://blog.newrelic.com/product-news/opentelemetry-user-experience/)
* [Alerts and Applied Intelligence](https://docs.newrelic.com/docs/alerts-applied-intelligence/overview/)

@ -39,6 +39,12 @@ There is a process for applying breaking changes:
- For example, feature X is announced to be deprecated in the 1.0.0 release notes and will then be removed in 1.2.0.
## Deprecations
Deprecations can apply to:
1. APIs, including alpha APIs
1. Preview features
1. Components
1. CLI
1. Features that could result in security vulnerabilities
Deprecations appear in release notes under a section named “Deprecations”, which indicates:
@ -56,6 +62,7 @@ After announcing a future breaking change, the change will happen in 2 releases
| Java domain builder classes deprecated (Users should use [setters](https://github.com/dapr/java-sdk/issues/587) instead) | Java SDK 1.3.0 | Java SDK 1.5.0 |
| Service invocation will no longer provide a default content type header of `application/json` when no content-type is specified. You must explicitly [set a content-type header]({{< ref "service_invocation_api.md#request-contents" >}}) for service invocation if your invoked apps rely on this header. | 1.7.0 | 1.9.0 |
| gRPC service invocation using `invoke` method is deprecated. Use proxy mode service invocation instead. See [How-To: Invoke services using gRPC ]({{< ref howto-invoke-services-grpc.md >}}) to use the proxy mode.| 1.9.0 | 1.10.0 |
| The CLI flag `--app-ssl` (in both the Dapr CLI and daprd) has been deprecated in favor of using `--app-protocol` with values `https` or `grpcs`. [daprd:6158](https://github.com/dapr/dapr/issues/6158) [cli:1267](https://github.com/dapr/cli/issues/1267)| 1.11.0 | 1.13.0 |
## Related links

@ -58,8 +58,12 @@ The [components-contrib](https://github.com/dapr/components-contrib/) repo relea
Note: Components have a production usage lifecycle status: Alpha, Beta, and Stable. These statuses are not related to their versioning. The tables of supported components show both their versions and their status.
* List of [state store components]({{< ref supported-state-stores.md >}})
* List of [pub/sub components]({{< ref supported-pubsub.md >}})
* List of [binding components]({{< ref supported-bindings.md >}})
* List of [secret store components]({{< ref supported-secret-stores.md >}})
* List of [configuration store components]({{< ref supported-configuration-stores.md >}})
* List of [lock components]({{< ref supported-locks.md >}})
* List of [cryptography components]({{< ref supported-cryptography.md >}})
* List of [middleware components]({{< ref supported-middleware.md >}})
For more information on component versioning, read [Version 2 and beyond of a component](https://github.com/dapr/components-contrib/blob/master/docs/developing-component.md#version-2-and-beyond-of-a-component).
@ -96,6 +100,8 @@ The version for a component implementation is determined by the `.spec.version`
### Component deprecations
Deprecations of components are announced two (2) releases ahead. Deprecation of a component results in a major version update of the component version. After 2 releases, the component is unregistered from the Dapr runtime, and trying to load it throws a fatal exception.
Component deprecations and removal are announced in the release notes.
## Quickstarts and Samples
Quickstarts in the [Quickstarts repo](https://github.com/dapr/quickstarts) are versioned with the runtime, where a table of corresponding versions is on the front page of the samples repo. Users should only use Quickstarts corresponding to the version of the runtime being run.
@ -103,3 +109,4 @@ Samples in the [Samples repo](https://github.com/dapr/samples) are each versione
## Related links
* Read the [Supported releases]({{< ref support-release-policy.md >}})
* Read the [Breaking Changes and Deprecation Policy]({{< ref breaking-changes-and-deprecations.md >}})

@ -67,6 +67,10 @@ spec:
| oidcClientID | N | Input/Output | The OAuth2 client ID that has been provisioned in the identity provider. Required when `authType` is set to `oidc` | `dapr-kafka` |
| oidcClientSecret | N | Input/Output | The OAuth2 client secret that has been provisioned in the identity provider: Required when `authType` is set to `oidc` | `"KeFg23!"` |
| oidcScopes | N | Input/Output | Comma-delimited list of OAuth2/OIDC scopes to request with the access token. Recommended when `authType` is set to `oidc`. Defaults to `"openid"` | `"openid,kafka-prod"` |
| version | N | Input/Output | Kafka cluster version. Defaults to 2.0.0. Note that this must be set to `1.0.0` when using Azure Event Hubs with Kafka. | `1.0.0` |
#### Note
The metadata `version` must be set to `1.0.0` when using Azure EventHubs with Kafka.
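For example, a hypothetical Kafka binding component pinned for Azure Event Hubs might set the metadata along these lines (the component name is illustrative, and the other required metadata such as brokers, topics, and authType are elided):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: eventhubs-kafka-binding  # illustrative name
spec:
  type: bindings.kafka
  version: v1
  metadata:
    # Required value when using Azure Event Hubs with Kafka
    - name: version
      value: "1.0.0"
    # ...other required metadata (brokers, topics, authType, etc.)
```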
## Binding support

@ -65,7 +65,7 @@ spec:
| maxMessageBytes | N | The maximum size in bytes allowed for a single Kafka message. Defaults to 1024. | `2048`
| consumeRetryInterval | N | The interval between retries when attempting to consume topics. Treats numbers without suffix as milliseconds. Defaults to 100ms. | `200ms` |
| consumeRetryEnabled | N | Disable consume retry by setting `"false"` | `"true"`, `"false"` |
| version | N | Kafka cluster version. Defaults to 2.0.0. Note that this must be set to `1.0.0` if you are using Azure EventHubs with Kafka. | `0.10.2.0` |
| caCert | N | Certificate authority certificate, required for using TLS. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n<base64-encoded DER>\n-----END CERTIFICATE-----"`
| clientCert | N | Client certificate, required for `authType` `mtls`. Can be `secretKeyRef` to use a secret reference | `"-----BEGIN CERTIFICATE-----\n<base64-encoded DER>\n-----END CERTIFICATE-----"`
| clientKey | N | Client key, required for `authType` `mtls` Can be `secretKeyRef` to use a secret reference | `"-----BEGIN RSA PRIVATE KEY-----\n<base64-encoded PKCS8>\n-----END RSA PRIVATE KEY-----"`
@ -78,6 +78,9 @@ spec:
The `secretKeyRef` above is referencing a [kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the tls information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
#### Note
The metadata `version` must be set to `1.0.0` when using Azure EventHubs with Kafka.
### Authentication
Kafka supports a variety of authentication schemes and Dapr supports several: SASL password, mTLS, OIDC/OAuth2. With the added authentication methods, the `authRequired` field has

@ -77,6 +77,9 @@ spec:
| batchingMaxSize | N | batchingMaxSize sets the maximum number of bytes permitted in a batch. If set to a value greater than 1, messages will be queued until this threshold is reached or batchingMaxMessages (see above) has been reached or the batch interval has elapsed. Default: `"128KB"` | `"131072"`|
| <topic-name>.jsonschema | N | Enforces JSON schema validation for the configured topic. |
| <topic-name>.avroschema | N | Enforces Avro schema validation for the configured topic. |
| publicKey | N | A public key to be used for publisher and consumer encryption. Value can be one of two options: file path for a local PEM cert, or the cert data string value |
| privateKey | N | A private key to be used for consumer encryption. Value can be one of two options: file path for a local PEM cert, or the cert data string value |
| keys | N | A comma delimited string containing names of [Pulsar session keys](https://pulsar.apache.org/docs/3.0.x/security-encryption/#how-it-works-in-pulsar). Used in conjunction with `publicKey` for publisher encryption |
### Enabling message delivery retries
@ -115,6 +118,90 @@ curl -X POST http://localhost:3500/v1.0/publish/myPulsar/myTopic?metadata.delive
}'
```
### E2E Encryption
Dapr supports setting public and private key pairs to enable Pulsar's [end-to-end encryption feature](https://pulsar.apache.org/docs/3.0.x/security-encryption/).
#### Enabling publisher encryption from file certs
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.pulsar
  version: v1
  metadata:
    - name: host
      value: "localhost:6650"
    - name: publicKey
      value: ./public.key
    - name: keys
      value: myapp.key
```
#### Enabling consumer encryption from file certs
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.pulsar
  version: v1
  metadata:
    - name: host
      value: "localhost:6650"
    - name: publicKey
      value: ./public.key
    - name: privateKey
      value: ./private.key
```
#### Enabling publisher encryption from value
> Note: It is recommended to [reference the public key from a secret]({{< ref component-secrets.md >}}).
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.pulsar
  version: v1
  metadata:
    - name: host
      value: "localhost:6650"
    - name: publicKey
      value: "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA1KDAM4L8RtJ+nLaXBrBh\nzVpvTemsKVZoAct8A+ShepOHT9lgHOCGLFGWNla6K6j+b3AV/P/fAAhwj82vwTDd\nruXSflvSdmYeFAw3Ypphc1A5oM53wSRWhg63potBNWqdDzj8ApYgqjpmjYSQdL5/\na3golb36GYFrY0MLFTv7wZ87pmMIPsOgGIcPbCHker2fRZ34WXYLb1hkeUpwx4eK\njpwcg35gccvR6o/UhbKAuc60V1J9Wof2sNgtlRaQej45wnpjWYzZrIyk5qUbn0Qi\nCdpIrXvYtANq0Id6gP8zJvUEdPIgNuYxEmVCl9jI+8eGI6peD0qIt8U80hf9axhJ\n3QIDAQAB\n-----END PUBLIC KEY-----\n"
    - name: keys
      value: myapp.key
```
#### Enabling consumer encryption from value
> Note: It is recommended to [reference the public and private keys from a secret]({{< ref component-secrets.md >}}).
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.pulsar
  version: v1
  metadata:
    - name: host
      value: "localhost:6650"
    - name: publicKey
      value: "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA1KDAM4L8RtJ+nLaXBrBh\nzVpvTemsKVZoAct8A+ShepOHT9lgHOCGLFGWNla6K6j+b3AV/P/fAAhwj82vwTDd\nruXSflvSdmYeFAw3Ypphc1A5oM53wSRWhg63potBNWqdDzj8ApYgqjpmjYSQdL5/\na3golb36GYFrY0MLFTv7wZ87pmMIPsOgGIcPbCHker2fRZ34WXYLb1hkeUpwx4eK\njpwcg35gccvR6o/UhbKAuc60V1J9Wof2sNgtlRaQej45wnpjWYzZrIyk5qUbn0Qi\nCdpIrXvYtANq0Id6gP8zJvUEdPIgNuYxEmVCl9jI+8eGI6peD0qIt8U80hf9axhJ\n3QIDAQAB\n-----END PUBLIC KEY-----\n"
    - name: privateKey
      value: "-----BEGIN RSA PRIVATE KEY-----\nMIIEpAIBAAKCAQEA1KDAM4L8RtJ+nLaXBrBhzVpvTemsKVZoAct8A+ShepOHT9lg\nHOCGLFGWNla6K6j+b3AV/P/fAAhwj82vwTDdruXSflvSdmYeFAw3Ypphc1A5oM53\nwSRWhg63potBNWqdDzj8ApYgqjpmjYSQdL5/a3golb36GYFrY0MLFTv7wZ87pmMI\nPsOgGIcPbCHker2fRZ34WXYLb1hkeUpwx4eKjpwcg35gccvR6o/UhbKAuc60V1J9\nWof2sNgtlRaQej45wnpjWYzZrIyk5qUbn0QiCdpIrXvYtANq0Id6gP8zJvUEdPIg\nNuYxEmVCl9jI+8eGI6peD0qIt8U80hf9axhJ3QIDAQABAoIBAQCKuHnM4ac/eXM7\nQPDVX1vfgyHc3hgBPCtNCHnXfGFRvFBqavKGxIElBvGOcBS0CWQ+Rg1Ca5kMx3TQ\njSweSYhH5A7pe3Sa5FK5V6MGxJvRhMSkQi/lJZUBjzaIBJA9jln7pXzdHx8ekE16\nBMPONr6g2dr4nuI9o67xKrtfViwRDGaG6eh7jIMlEqMMc6WqyhvI67rlVDSTHFKX\njlMcozJ3IT8BtTzKg2Tpy7ReVuJEpehum8yn1ZVdAnotBDJxI07DC1cbOP4M2fHM\ngfgPYWmchauZuTeTFu4hrlY5jg0/WLs6by8r/81+vX3QTNvejX9UdTHMSIfQdX82\nAfkCKUVhAoGBAOvGv+YXeTlPRcYC642x5iOyLQm+BiSX4jKtnyJiTU2s/qvvKkIu\nxAOk3OtniT9NaUAHEZE9tI71dDN6IgTLQlAcPCzkVh6Sc5eG0MObqOO7WOMCWBkI\nlaAKKBbd6cGDJkwGCJKnx0pxC9f8R4dw3fmXWgWAr8ENiekMuvjSfjZ5AoGBAObd\ns2L5uiUPTtpyh8WZ7rEvrun3djBhzi+d7rgxEGdditeiLQGKyZbDPMSMBuus/5wH\nwfi0xUq50RtYDbzQQdC3T/C20oHmZbjWK5mDaLRVzWS89YG/NT2Q8eZLBstKqxkx\ngoT77zoUDfRy+CWs1xvXzgxagD5Yg8/OrCuXOqWFAoGAPIw3r6ELknoXEvihASxU\nS4pwInZYIYGXpygLG8teyrnIVOMAWSqlT8JAsXtPNaBtjPHDwyazfZrvEmEk51JD\nX0tA8M5ah1NYt+r5JaKNxp3P/8wUT6lyszyoeubWJsnFRfSusuq/NRC+1+KDg/aq\nKnSBu7QGbm9JoT2RrmBv5RECgYBRn8Lj1I1muvHTNDkiuRj2VniOSirkUkA2/6y+\nPMKi+SS0tqcY63v4rNCYYTW1L7Yz8V44U5mJoQb4lvpMbolGhPljjxAAU3hVkItb\nvGVRlSCIZHKczADD4rJUDOS7DYxO3P1bjUN4kkyYx+lKUMDBHFzCa2D6Kgt4dobS\n5qYajQKBgQC7u7MFPkkEMqNqNGu5erytQkBq1v1Ipmf9rCi3iIj4XJLopxMgw0fx\n6jwcwNInl72KzoUBLnGQ9PKGVeBcgEgdI+a+tq+1TJo6Ta+hZSx+4AYiKY18eRKG\neNuER9NOcSVJ7Eqkcw4viCGyYDm2vgNV9HJ0VlAo3RDh8x5spEN+mg==\n-----END RSA PRIVATE KEY-----\n"
```
## Create a Pulsar instance
{{< tabs "Self-Hosted" "Kubernetes">}}

@ -5,26 +5,28 @@
<div class="col-6 col-sm-2 text-xs-center order-sm-2" style="margin-top: 1rem;">
{{ with $links }}
{{ with index . "user"}}
{{ template "footer-links-block" . }}
{{ end }}
{{ end }}
</div>
<div class="col-6 col-sm-2 text-right text-xs-center order-sm-3" style="margin-top: 1rem;">
{{ with $links }}
{{ with index . "developer"}}
{{ template "footer-links-block" . }}
{{ end }}
{{ end }}
</div>
<div class="col-12 col-sm-6 text-center py-2 order-sm-2">
{{ with .Site.Params }}<small class="text-white">&copy; {{ now.Year}} {{ .trademark | markdownify }}</small>{{ end }}
{{ if not .Site.Params.ui.footer_about_disable }}
{{ with .Site.GetPage "about" }}<p class="mt-2"><a href="{{ .RelPermalink }}">{{ .Title }}</a></p>{{ end }}
{{ end }}
</div>
</div>
</div>
<img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=4848fb3b-3edb-4329-90a9-a9d79afff054" />
<script> (function (ss, ex) { window.ldfdr = window.ldfdr || function () { (ldfdr._q = ldfdr._q || []).push([].slice.call(arguments)); }; (function (d, s) { fs = d.getElementsByTagName(s)[0]; function ce(src) { var cs = d.createElement(s); cs.src = src; cs.async = 1; fs.parentNode.insertBefore(cs, fs); }; ce('https://sc.lfeeder.com/lftracker_v1_' + ss + (ex ? '_' + ex : '') + '.js'); })(document, 'script'); })('JMvZ8gblPjda2pOd'); </script>
<script async src="https://tag.clearbitscripts.com/v1/pk_3f4df076549ad932eda451778a42b09b/tags.js" referrerpolicy="strict-origin-when-cross-origin"></script>
</footer>
{{ define "footer-links-block" }}
<ul class="list-inline mb-0">


@ -1 +1 @@
Subproject commit f42b690f4c67e6bb4209932f660c46a96d0b0457
Subproject commit edb09a08b7a2ca63983f5237b307c40cae86d3bb

@ -1 +1 @@
Subproject commit d02f9524d96779350323128f88fefdd6fbd6787a
Subproject commit effc2f0d3c92ad76e11958e427c8d3b0900e1932

@ -1 +1 @@
Subproject commit 10d09619db5981ba45ec1268687c7a104cb338c1
Subproject commit d1c61cae40e7c5d933d92705198506d947960aaa

@ -1 +1 @@
Subproject commit ae34854397150715fe440dec831461d70e17d07a
Subproject commit 11145914a5fddd1ada06b6a3b938c27b401f7025

@ -1 +1 @@
Subproject commit 5190e79f4275ffcdaffe7efb59c21dffddefb5f8
Subproject commit 5051a9d5d92003924322a8ddbdf230fb8a872dd7