---
type: docs
title: "How to: Author and manage Dapr Conversation AI in the Java SDK"
linkTitle: "How to: Author and manage Conversation AI"
weight: 20000
description: How to get up and running with Conversation AI using the Dapr Java SDK
---

In this demonstration, we will look at how to use the Conversation API to converse with a Large Language Model (LLM). The API returns the LLM's response to a given prompt. With the provided Conversation AI example, you will:

  • Provide a prompt using the Conversation AI example
  • Filter out personally identifiable information (PII)

This example uses the default configuration from dapr init in self-hosted mode.
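The component the example talks to is declared in a `conversation.yaml` file in the Dapr resources directory. A minimal sketch of an echo conversation component is shown below (the component name `echo` is what the example code passes to `ConversationRequest`; exact fields may vary by Dapr version, so treat this as an assumption to verify against your installation):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
```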

## Prerequisites

## Set up the environment

Clone the Java SDK repo and navigate into it.

```bash
git clone https://github.com/dapr/java-sdk.git
cd java-sdk
```

Run the following command to install the requirements for running the Conversation AI example with the Dapr Java SDK.

```bash
mvn clean install -DskipTests
```

From the Java SDK root directory, navigate to the examples directory.

```bash
cd examples
```

Run the Dapr sidecar.

```bash
dapr run --app-id conversationapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080
```

Now, Dapr is listening for HTTP requests at http://localhost:3500 and for gRPC requests on port 51439.

## Send a prompt with personally identifiable information (PII) to the Conversation AI API

In `DemoConversationAI`, a prompt is sent to the LLM using the `converse` method of the `DaprPreviewClient`.

```java
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.ConversationInput;
import io.dapr.client.domain.ConversationRequest;
import io.dapr.client.domain.ConversationResponse;
import reactor.core.publisher.Mono;

import java.util.List;

public class DemoConversationAI {
  /**
   * The main method to start the client.
   *
   * @param args Input arguments (unused).
   */
  public static void main(String[] args) {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {
      System.out.println("Sending the following input to LLM: Hello How are you? This is the my number 672-123-4567");

      ConversationInput daprConversationInput = new ConversationInput("Hello How are you? "
              + "This is the my number 672-123-4567");

      // Component name is the name provided in the metadata block of the conversation.yaml file.
      Mono<ConversationResponse> responseMono = client.converse(new ConversationRequest("echo",
              List.of(daprConversationInput))
              .setContextId("contextId")
              .setScrubPii(true)
              .setTemperature(1.1d));
      ConversationResponse response = responseMono.block();
      System.out.printf("Conversation output: %s", response.getConversationOutputs().get(0).getResult());
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }
}
```

Run the DemoConversationAI with the following command.

```bash
java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.conversation.DemoConversationAI
```

## Sample output

```
== APP == Conversation output: Hello How are you? This is the my number <ISBN>
```

As shown in the output, the phone number sent to the API is obfuscated and returned as the placeholder `<ISBN>`. The example above uses an "echo" component for testing, which simply returns the input message. When integrated with an LLM such as OpenAI or Claude, you'll receive meaningful responses instead of the echoed input.
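To give a rough intuition for what `setScrubPii(true)` asks the runtime to do, here is a local, simplified sketch of phone-number scrubbing with a plain regex. This is an illustration only, not Dapr's actual scrubber; the `<PHONE_NUMBER>` placeholder and the pattern are assumptions for the sketch (the real scrubber recognizes many PII categories and chooses its own tags, as the `<ISBN>` tag in the sample output shows).

```java
import java.util.regex.Pattern;

public class PiiScrubSketch {
  // Naive pattern for US-style numbers like 672-123-4567 (illustrative only).
  private static final Pattern PHONE = Pattern.compile("\\b\\d{3}-\\d{3}-\\d{4}\\b");

  // Replace each match with a placeholder tag, similar in spirit to the
  // obfuscated output the Conversation API returns.
  static String scrub(String text) {
    return PHONE.matcher(text).replaceAll("<PHONE_NUMBER>");
  }

  public static void main(String[] args) {
    System.out.println(scrub("Hello How are you? This is the my number 672-123-4567"));
  }
}
```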

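To target a real LLM instead of the echo component, you register a different conversation component and pass its name to `ConversationRequest` in place of `"echo"`. Below is a sketch of an OpenAI-backed component; the metadata field names (`key`, `model`) and the model value are assumptions to check against the conversation component reference for your Dapr version:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
    - name: key
      value: "YOUR_OPENAI_API_KEY"
    - name: model
      value: "gpt-4o-mini"
```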
## Next steps

  • [Learn more about Conversation AI]({{< ref conversation-overview.md >}})
  • [Conversation AI API reference]({{< ref conversation_api.md >}})