diff --git a/daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md b/daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md
index 37cc5764c..7e7fd0fb4 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md
@@ -52,8 +52,6 @@ spec:
     value:
   - name: model
     value: gpt-4-turbo
-  - name: cacheTTL
-    value: 10m
 ```
 
 ## Connect the conversation client
@@ -114,12 +112,12 @@ func main() {
 	}
 
 	input := dapr.ConversationInput{
-		Message: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
-		// Role: nil, // Optional
-		// ScrubPII: nil, // Optional
+		Content: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
+		// Role: "", // Optional
+		// ScrubPII: false, // Optional
 	}
 
-	fmt.Printf("conversation input: %s\n", input.Message)
+	fmt.Printf("conversation input: %s\n", input.Content)
 
 	var conversationComponent = "echo"
@@ -163,7 +161,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
     let request =
         ConversationRequestBuilder::new(conversation_component, vec![input.clone()]).build();
 
-    println!("conversation input: {:?}", input.message);
+    println!("conversation input: {:?}", input.content);
 
     let response = client.converse_alpha1(request).await?;
@@ -224,6 +222,16 @@ dapr run --app-id=conversation --resources-path ./config --dapr-grpc-port 3500 -
 
 {{< /tabs >}}
 
+## Advanced features
+
+The conversation API supports the following features:
+
+1. **Prompt caching:** Allows developers to cache prompts in Dapr, leading to much faster response times and reducing costs for egress and for inserting the prompt into the LLM provider's cache.
+
+1. **PII scrubbing:** Allows for the obfuscation of data going into and coming out of the LLM.
+
+To learn how to enable these features, see the [conversation API reference guide]({{< ref conversation_api.md >}}).
+
 ## Related links
 
 Try out the conversation API using the full examples provided in the supported SDK repos.
diff --git a/daprdocs/content/en/reference/api/conversation_api.md b/daprdocs/content/en/reference/api/conversation_api.md
index 7f022134f..44fa52d28 100644
--- a/daprdocs/content/en/reference/api/conversation_api.md
+++ b/daprdocs/content/en/reference/api/conversation_api.md
@@ -30,40 +30,34 @@ POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
 
 | Field | Description |
 | --------- | ----------- |
-| `conversationContext` | The ID of an existing chat room (like in ChatGPT). |
-| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. |
-| `metadata` | [Metadata](#metadata) passed to conversation components. |
+| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required |
+| `cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. Optional |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information returned by the LLM. Optional |
+| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency (0) or creativity (1). Optional |
+| `metadata` | [Metadata](#metadata) passed to conversation components. Optional |
 
-#### Metadata
+#### Input body
 
-Metadata can be sent in the request’s URL. It must be prefixed with `metadata.`, as shown in the table below.
-
-| Parameter | Description |
+| Field | Description |
 | --------- | ----------- |
-| `metadata.key` | The API key for the component. `key` is not applicable to the [AWS Bedrock component]({{< ref "aws-bedrock.md#authenticating-aws" >}}). |
-| `metadata.model` | The Large Language Model you're using. Value depends on which conversation component you're using. `model` is not applicable to the [DeepSeek component]({{< ref deepseek.md >}}). |
-| `metadata.cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. |
+| `content` | The message content to send to the LLM. Required |
+| `role` | The role for the LLM to assume. Possible values: `user`, `tool`, `assistant`. Optional |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information present in the content field. Optional |
 
-For example, to call for [Anthropic]({{< ref anthropic.md >}}):
-
-```bash
-curl POST http://localhost:3500/v1.0-alpha1/conversation/anthropic/converse?metadata.key=key1&metadata.model=claude-3-5-sonnet-20240620&metadata.cacheTTL=10m
-```
-
-{{% alert title="Note" color="primary" %}}
-The metadata parameters available depend on the conversation component you use. [See all the supported components for the conversation API.]({{< ref supported-conversation >}})
-{{% /alert %}}
-
-### Request content
+### Request content example
 
 ```json
 REQUEST = {
-  "inputs": ["what is Dapr", "Why use Dapr"],
-  "metadata": {
-    "model": "model-type-based-on-component-used",
-    "key": "authKey",
-    "cacheTTL": "10m",
-  }
+  "inputs": [
+    {
+      "content": "What is Dapr?",
+      "role": "user", // Optional
+      "scrubPII": true // Optional. Will obfuscate any sensitive information found in the content field
+    }
+  ],
+  "cacheTTL": "10m", // Optional
+  "scrubPII": true, // Optional. Will obfuscate any sensitive information returned by the LLM
+  "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
 }
 ```