From df912383cab92129a46a015b4f7cfcc3c4f049a3 Mon Sep 17 00:00:00 2001
From: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
Date: Fri, 17 Jan 2025 14:15:53 -0500
Subject: [PATCH] removing old paragraph

Signed-off-by: Hannah Hunter <94493363+hhunter-ms@users.noreply.github.com>
---
 .../building-blocks/conversation/conversation-overview.md | 6 ++----
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md b/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
index 237314ac3..595870e27 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
@@ -10,12 +10,10 @@ description: "Overview of the conversation API building block"
 The conversation API is currently in [alpha]({{< ref "certification-lifecycle.md#certification-levels" >}}).
 {{% /alert %}}
 
-Using the Dapr conversation API, you can reduce the complexity of interacting with Large Language Models (LLMs) and enable critical performance and security functionality with features like prompt caching and personally identifiable information (PII) data obfuscation.
+Dapr's conversation API reduces the complexity of securely and reliably interacting with Large Language Models (LLM) at scale. Whether you're a developer who doesn't have the necessary native SDKs or a polyglot shop who just wants to focus on the prompt aspects of LLM interactions, the conversation API provides one consistent API entry point to talk to underlying LLM providers.
 
 Diagram showing the flow of a user's app communicating with Dapr's LLM components.
 
-Dapr's conversation API reduces the complexity of securely and reliably interacting with Large Language Models (LLM) at scale. 
-Whether you're a developer who doesn't have the necessary native SDKs or a polyglot shop who just wants to focus on the prompt aspects of LLM interactions, the conversation API provides one consistent API entry point to talk to underlying LLM providers.
-
 In additon to enabling critical performance and security functionality (like [prompt caching]({{< ref "#prompt-caching" >}}) and [PII scrubbing]({{< ref "#personally-identifiable-information-pii-obfuscation" >}})), you can also pair the conversation API with Dapr functionalities, like:
 - Resiliency circuit breakers and retries to circumvent limit and token errors, or
 - Middleware to authenticate requests coming to and from the LLM
@@ -57,4 +55,4 @@ Want to skip the quickstarts? Not a problem. You can try out the conversation bu
 ## Next steps
 - [How-To: Converse with an LLM using the conversation API]({{< ref howto-conversation-layer.md >}})
-- [Conversation API components]({{< ref supported-conversation >}})
\ No newline at end of file
+- [Conversation API components]({{< ref supported-conversation >}})
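The paragraph this patch keeps describes "one consistent API entry point to talk to underlying LLM providers". As a rough sketch (outside the patch itself) of what that entry point looks like from an app, assuming Dapr's default sidecar HTTP port and a hypothetical conversation component named `echo` — the alpha route and request shape below are illustrative, not taken from this patch:

```python
import json

# Default Dapr sidecar HTTP port (assumption: standard local setup).
DAPR_HTTP_PORT = 3500
# Hypothetical conversation component name registered with the sidecar.
COMPONENT = "echo"

# The single sidecar entry point the docs describe: the app talks to Dapr,
# and Dapr's conversation component talks to the underlying LLM provider.
url = f"http://localhost:{DAPR_HTTP_PORT}/v1.0-alpha1/conversation/{COMPONENT}/converse"

# Minimal request body: a list of inputs to send to the LLM.
payload = {"inputs": [{"content": "What is Dapr?"}]}

# In a real app you would POST `payload` to `url` with any HTTP client;
# here we only print the request to show the consistent entry point.
print(url)
print(json.dumps(payload))
```

Because the sidecar owns the provider connection, swapping LLM backends is a component-configuration change rather than an application-code change.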