Update conversation api (#4546)

* update conversation api

Signed-off-by: yaron2 <schneider.yaron@live.com>

* Update daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md

Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
Signed-off-by: Yaron Schneider <schneider.yaron@live.com>

* Update daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md

Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
Signed-off-by: Yaron Schneider <schneider.yaron@live.com>

* Update daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md

Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
Signed-off-by: Yaron Schneider <schneider.yaron@live.com>

---------

Signed-off-by: yaron2 <schneider.yaron@live.com>
Signed-off-by: Yaron Schneider <schneider.yaron@live.com>
Co-authored-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
Yaron Schneider 2025-02-24 10:32:54 -08:00 committed by GitHub
parent da9c8f8842
commit 0b177eccd1
2 changed files with 36 additions and 34 deletions

daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md

@@ -52,8 +52,6 @@ spec:
     value: <REPLACE_WITH_YOUR_KEY>
   - name: model
     value: gpt-4-turbo
-  - name: cacheTTL
-    value: 10m
 ```
 
 ## Connect the conversation client
@@ -114,12 +112,12 @@ func main() {
 	}
 	input := dapr.ConversationInput{
-		Message: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
-		// Role: nil, // Optional
-		// ScrubPII: nil, // Optional
+		Content: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
+		// Role: "", // Optional
+		// ScrubPII: false, // Optional
 	}
 
-	fmt.Printf("conversation input: %s\n", input.Message)
+	fmt.Printf("conversation input: %s\n", input.Content)
 
 	var conversationComponent = "echo"
@@ -163,7 +161,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
     let request =
         ConversationRequestBuilder::new(conversation_component, vec![input.clone()]).build();
 
-    println!("conversation input: {:?}", input.message);
+    println!("conversation input: {:?}", input.content);
 
     let response = client.converse_alpha1(request).await?;
@@ -224,6 +222,16 @@ dapr run --app-id=conversation --resources-path ./config --dapr-grpc-port 3500 -
 {{< /tabs >}}
 
+## Advanced features
+
+The conversation API supports the following features:
+
+1. **Prompt caching:** Allows developers to cache prompts in Dapr, leading to much faster response times and reducing costs on egress and on inserting the prompt into the LLM provider's cache.
+
+1. **PII scrubbing:** Allows for the obfuscation of data going in and out of the LLM.
+
+To learn how to enable these features, see the [conversation API reference guide]({{< ref conversation_api.md >}}).
+
 ## Related links
 
 Try out the conversation API using the full examples provided in the supported SDK repos.
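
To illustrate the renamed input field and the PII-scrubbing option introduced above, here is a minimal Go sketch. It assumes the `dapr.ConversationInput` field names shown in this diff (`Content`, `Role`, `ScrubPII`), a sidecar gRPC port of 3500, and the `echo` component; the request constructor and `ConverseAlpha1` call names are assumptions and may differ between Go SDK releases.

```go
package main

import (
	"context"
	"fmt"
	"log"

	dapr "github.com/dapr/go-sdk/client"
)

func main() {
	// Assumes the sidecar was started with --dapr-grpc-port 3500, as in the run command above.
	client, err := dapr.NewClientWithPort("3500")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Field names follow the snippet in this diff; ScrubPII obfuscates
	// sensitive data found in the content of this input before it reaches the LLM.
	input := dapr.ConversationInput{
		Content:  "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
		Role:     "user", // Optional
		ScrubPII: true,   // Optional
	}

	// Assumed request constructor and method names; verify against your go-sdk version.
	request := dapr.NewConversationRequest("echo", []dapr.ConversationInput{input})
	resp, err := client.ConverseAlpha1(context.Background(), request)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("conversation output: %+v\n", resp)
}
```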

conversation_api.md

@@ -30,40 +30,34 @@ POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
 | Field | Description |
 | --------- | ----------- |
 | `conversationContext` | The ID of an existing chat room (like in ChatGPT). |
-| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. |
-| `metadata` | [Metadata](#metadata) passed to conversation components. |
+| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required |
+| `cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. Optional |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information returning from the LLM. Optional |
+| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency and creativity. Optional |
+| `metadata` | [Metadata](#metadata) passed to conversation components. Optional |
 
-#### Metadata
+#### Input body
 
-Metadata can be sent in the requests URL. It must be prefixed with `metadata.`, as shown in the table below.
-
-| Parameter | Description |
+| Field | Description |
 | --------- | ----------- |
-| `metadata.key` | The API key for the component. `key` is not applicable to the [AWS Bedrock component]({{< ref "aws-bedrock.md#authenticating-aws" >}}). |
-| `metadata.model` | The Large Language Model you're using. Value depends on which conversation component you're using. `model` is not applicable to the [DeepSeek component]({{< ref deepseek.md >}}). |
-| `metadata.cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. |
-
-For example, to call for [Anthropic]({{< ref anthropic.md >}}):
-
-```bash
-curl POST http://localhost:3500/v1.0-alpha1/conversation/anthropic/converse?metadata.key=key1&metadata.model=claude-3-5-sonnet-20240620&metadata.cacheTTL=10m
-```
-
-{{% alert title="Note" color="primary" %}}
-The metadata parameters available depend on the conversation component you use. [See all the supported components for the conversation API.]({{< ref supported-conversation >}})
-{{% /alert %}}
+| `content` | The message content to send to the LLM. Required |
+| `role` | The role for the LLM to assume. Possible values: 'user', 'tool', 'assistant' |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information present in the content field. Optional |
 
-### Request content
+### Request content example
 
 ```json
 REQUEST = {
-  "inputs": ["what is Dapr", "Why use Dapr"],
-  "metadata": {
-    "model": "model-type-based-on-component-used",
-    "key": "authKey",
-    "cacheTTL": "10m",
-  }
+  "inputs": [
+    {
+      "content": "What is Dapr?",
+      "role": "user", // Optional
+      "scrubPII": "true", // Optional. Will obfuscate any sensitive information found in the content field
+    },
+  ],
+  "cacheTTL": "10m", // Optional
+  "scrubPII": "true", // Optional. Will obfuscate any sensitive information returning from the LLM
+  "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
 }
 ```
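
Since the request body documented above is plain JSON over HTTP, it can also be exercised without an SDK. The sketch below is illustrative only: it assumes a sidecar HTTP port of 3500 and a conversation component named `echo`, and posts the example body (minus the inline comments, which are not valid JSON) to the documented `/v1.0-alpha1/conversation/<llm-name>/converse` endpoint.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Mirrors the "Request content example" above: per-input content/role/scrubPII
	// plus the request-level cacheTTL, scrubPII, and temperature fields.
	body := []byte(`{
	  "inputs": [
	    { "content": "What is Dapr?", "role": "user", "scrubPII": "true" }
	  ],
	  "cacheTTL": "10m",
	  "scrubPII": "true",
	  "temperature": 0.5
	}`)

	// Assumes the Dapr sidecar HTTP port is 3500 and the component is named "echo".
	url := "http://localhost:3500/v1.0-alpha1/conversation/echo/converse"
	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	out, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}
```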