add component reference, update quickstarts per alice and mark review, update how-to, update diagrams with deepseek

Signed-off-by: Hannah Hunter <hannahhunter@microsoft.com>
This commit is contained in:
Hannah Hunter 2025-02-05 11:58:21 -05:00
parent 5e37ad6c56
commit 6083a4c4ac
6 changed files with 76 additions and 12 deletions

View File

@ -34,6 +34,28 @@ spec:
version: v1
```
### Use the OpenAI component
To interface with a real LLM, use one of the other [supported conversation components]({{< ref "supported-conversation" >}}), including OpenAI, Hugging Face, Anthropic, DeepSeek, and more.
For example, to swap out the `echo` mock component for an OpenAI component, replace the contents of the `conversation.yaml` file with the following. You'll need to copy your API key into the component file.
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  metadata:
  - name: key
    value: <REPLACE_WITH_YOUR_KEY>
  - name: model
    value: gpt-4-turbo
  - name: cacheTTL
    value: 10m
```
## Connect the conversation client
The following examples use an HTTP client to send a POST request to Dapr's sidecar HTTP endpoint. You can also use [the Dapr SDK client instead]({{< ref "#related-links" >}}).
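For instance, here is a minimal sketch of that request in Go, assuming the sidecar is listening on the default HTTP port `3500`, a conversation component named `echo` is loaded, and the `v1.0-alpha1` request shape (`inputs` entries with a `content` field); check the conversation API reference for the exact contract in your Dapr version.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Default Dapr sidecar HTTP port; adjust if your sidecar uses another port.
	url := "http://localhost:3500/v1.0-alpha1/conversation/echo/converse"

	// Request body shape assumed from the alpha conversation API: a list of
	// inputs, each carrying a content string.
	body := []byte(`{"inputs": [{"content": "What is Dapr?"}]}`)

	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The echo mock component simply returns the input text in its outputs.
	out, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```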

View File

@ -10,7 +10,7 @@ description: Get started with the Dapr conversation building block
The conversation building block is currently in **alpha**.
{{% /alert %}}
Let's take a look at how the [Dapr conversation building block]({{< ref conversation-overview.md >}}) makes interacting with the LLM component easier. In this quickstart, you use the echo component to communicate with the LLM and ask it for a poem about Dapr.
Let's take a look at how the [Dapr conversation building block]({{< ref conversation-overview.md >}}) makes interacting with Large Language Models (LLMs) easier. In this quickstart, you use the echo component to communicate with the mock LLM and ask it for a poem about Dapr.
You can try out this conversation quickstart by either:
@ -31,7 +31,7 @@ You can try out this conversation quickstart by either:
For this example, you will need:
- [Dapr CLI and initialized environment](https://docs.dapr.io/getting-started).
- [.NET SDK or .NET 6 SDK installed](https://dotnet.microsoft.com/download).
- [.NET 8+ SDK installed](https://dotnet.microsoft.com/download).
<!-- IGNORE_LINKS -->
- [Docker Desktop](https://www.docker.com/products/docker-desktop)
<!-- END_IGNORE -->
@ -67,9 +67,7 @@ dapr run -f .
### What happened?
When you ran `dapr init` during Dapr install, the following YAML files were generated in the `.dapr/components` directory:
- [`dapr.yaml` Multi-App Run template file]({{< ref "#dapryaml-multi-app-run-template-file" >}})
- [`pubsub.yaml` component file]({{< ref "#pubsubyaml-component-file" >}})
When you ran `dapr init` during Dapr install, the [`dapr.yaml` Multi-App Run template file]({{< ref "#dapryaml-multi-app-run-template-file" >}}) was generated in the `.dapr/components` directory.
Running `dapr run -f .` in this Quickstart started the [conversation Program.cs]({{< ref "#programcs-conversation-app" >}}).
@ -88,7 +86,7 @@ apps:
command: ["dotnet", "run"]
```
#### `conversation.yaml` LLM component
#### Echo mock LLM component
In [`conversation/components`](https://github.com/dapr/quickstarts/tree/master/conversation/components), the [`conversation.yaml` file](https://github.com/dapr/quickstarts/tree/master/conversation/components/conversation.yml) configures the echo mock LLM component.
@ -102,6 +100,8 @@ spec:
version: v1
```
To interface with a real LLM, swap out the mock component for one of [the supported conversation components]({{< ref "supported-conversation" >}}). For example, to use an OpenAI component, see the [example in the conversation how-to guide]({{< ref "howto-conversation-layer.md#use-the-openai-component" >}}).
#### `Program.cs` conversation app
In the application code:
@ -197,9 +197,7 @@ dapr run -f .
### What happened?
When you ran `dapr init` during Dapr install, the following YAML files were generated in the `.dapr/components` directory:
- [`dapr.yaml` Multi-App Run template file]({{< ref "#dapryaml-multi-app-run-template-file" >}})
- [`pubsub.yaml` component file]({{< ref "#pubsubyaml-component-file" >}})
When you ran `dapr init` during Dapr install, the [`dapr.yaml` Multi-App Run template file]({{< ref "#dapryaml-multi-app-run-template-file" >}}) was generated in the `.dapr/components` directory.
Running `dapr run -f .` in this Quickstart started [conversation.go]({{< ref "#conversationgo-conversation-app" >}}).
@ -218,7 +216,7 @@ apps:
command: ["go", "run", "."]
```
#### `conversation.yaml` LLM component
#### Echo mock LLM component
In the [`conversation/components`](https://github.com/dapr/quickstarts/tree/master/conversation/components) directory of the quickstart, the [`conversation.yaml` file](https://github.com/dapr/quickstarts/tree/master/conversation/components/conversation.yml) configures the echo mock LLM component.
@ -232,7 +230,7 @@ spec:
version: v1
```
For authentication, the component also uses a secret store called [`envvar-secrets`](https://github.com/dapr/quickstarts/tree/master/conversation/components/envvar.yml).
To interface with a real LLM, swap out the mock component for one of [the supported conversation components]({{< ref "supported-conversation" >}}). For example, to use an OpenAI component, see the [example in the conversation how-to guide]({{< ref "howto-conversation-layer.md#use-the-openai-component" >}}).
#### `conversation.go` conversation app
@ -295,7 +293,7 @@ func main() {
For this example, you will need:
- [Dapr CLI and initialized environment](https://docs.dapr.io/getting-started).
- [.NET SDK or .NET 6 SDK installed](https://dotnet.microsoft.com/download).
- [.NET 8+ SDK installed](https://dotnet.microsoft.com/download).
<!-- IGNORE_LINKS -->
- [Docker Desktop](https://www.docker.com/products/docker-desktop)
<!-- END_IGNORE -->

View File

@ -0,0 +1,39 @@
---
type: docs
title: "DeepSeek"
linkTitle: "DeepSeek"
description: Detailed information on the DeepSeek conversation component
---
## Component format
A Dapr `conversation.yaml` component file has the following structure:
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: deepseek
spec:
  type: conversation.deepseek
  metadata:
  - name: key
    value: mykey
  - name: maxTokens
    value: 2048
```
{{% alert title="Warning" color="warning" %}}
The above example uses secrets as plain strings. It is recommended to use a secret store for the secrets, as described [here]({{< ref component-secrets.md >}}).
{{% /alert %}}
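For example, here is a sketch of the same component referencing the key from a secret store instead of embedding it; the secret store component name `localsecretstore` and the secret name `deepseek-key` are placeholders for whatever your environment defines.

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: deepseek
spec:
  type: conversation.deepseek
  metadata:
  - name: key
    secretKeyRef:
      name: deepseek-key   # placeholder: secret name in your secret store
      key: api-key         # placeholder: key within that secret
  - name: maxTokens
    value: 2048
auth:
  secretStore: localsecretstore   # placeholder: your secret store component name
```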
## Spec metadata fields
| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for DeepSeek. | `mykey` |
| `maxTokens` | N | The maximum number of tokens for each request. | `2048` |
## Related links
- [Conversation API overview]({{< ref conversation-overview.md >}})

View File

@ -18,3 +18,8 @@
  state: Alpha
  version: v1
  since: "1.15"
- component: DeepSeek
  link: deepseek
  state: Alpha
  version: v1
  since: "1.15"

Binary file not shown. Before size: 203 KiB | After size: 178 KiB

Binary file not shown. Before size: 44 KiB | After size: 44 KiB