{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# LLM: OpenAI Chat Endpoint Basic Examples\n", "\n", "This notebook demonstrates how to use the `OpenAIChatClient` in `dapr-agents` for basic tasks with the OpenAI Chat API. We will explore:\n", "\n", "* Initializing the OpenAI Chat client.\n", "* Generating responses to simple prompts.\n", "* Using a `.prompty` file to provide context/history for enhanced generation." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Install Required Libraries\n", "Before starting, ensure the required libraries are installed:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!pip install dapr-agents python-dotenv" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Load Environment Variables\n", "\n", "Load API keys or other configuration values from your `.env` file using `dotenv`." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 1, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from dotenv import load_dotenv\n", "load_dotenv()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Import OpenAIChatClient" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "from dapr_agents import OpenAIChatClient" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Basic Chat Completion\n", "\n", "Initialize the `OpenAIChatClient` and generate a response to a simple prompt." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "# Initialize the client\n", "llm = OpenAIChatClient()" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One famous dog is Lassie, the Rough Collie from the television series and films that became iconic for her intelligence and heroic adventures.', role='assistant'), logprobs=None)], created=1741085405, id='chatcmpl-B7K8brL19kn1KgDTG9on7n7ICnt3P', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 28, 'prompt_tokens': 12, 'total_tokens': 40, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Generate a response\n", "response = llm.generate('Name a famous dog!')\n", "\n", "# Display the response\n", "response" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'content': 'One famous dog is Lassie, the Rough Collie from the television series and films that became iconic for her intelligence and heroic adventures.', 'role': 'assistant'}\n" ] } ], "source": [ "print(response.get_message())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using a Prompty File for Context\n", "\n", "Use a `.prompty` file to provide context for chat history or additional instructions." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "# Load the client configuration and prompt template from the .prompty file\n", "llm = OpenAIChatClient.from_prompty('basic-openai-chat-history.prompty')" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "ChatPromptTemplate(input_variables=['chat_history', 'question'], pre_filled_variables={}, messages=[SystemMessage(content='You are an AI assistant who helps people find information.\\nAs the assistant, you answer questions briefly, succinctly, \\nand in a personable manner using markdown and even add some personal flair with appropriate emojis.\\n\\n{% for item in chat_history %}\\n{{item.role}}:\\n{{item.content}}\\n{% endfor %}', role='system'), UserMessage(content='{{question}}', role='user')], template_format='jinja2')" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Inspect the prompt template parsed from the .prompty file\n", "llm.prompt_template" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"Hey there! I'm your friendly AI assistant. You can call me whatever you'd like, but I don't have a specific name. 😊 How can I help you today?\", role='assistant'), logprobs=None)], created=1741085407, id='chatcmpl-B7K8dI84xY2hjaEspDtJL5EICbSLh', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 34, 'prompt_tokens': 57, 'total_tokens': 91, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})" ] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Template variables such as 'question' are filled in from input_data\n", "llm.generate(input_data={\"question\":\"What is your name?\"})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Chat Completion with Messages\n", "\n", "Instead of a plain string, pass structured message objects (for example `UserMessage`) so each turn carries an explicit role." ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "from dapr_agents.types import UserMessage\n", "\n", "# Initialize the client\n", "llm = OpenAIChatClient()\n", "\n", "# Generate a response using structured messages\n", "response = llm.generate(messages=[UserMessage(\"hello\")])" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'content': 'Hello! How can I assist you today?', 'role': 'assistant'}\n" ] } ], "source": [ "# Display the structured response\n", "print(response.get_message())" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [], "source": [ "# No .prompty file was used this time, so no prompt template is attached (returns None)\n", "llm.prompt_template" ] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.12.1" } }, "nbformat": 4, "nbformat_minor": 2 }