# LLM calls with Hugging Face

This quickstart demonstrates how to use Dapr Agents' LLM capabilities to interact with language models hosted on the Hugging Face Hub and generate both free-form text and structured data. You'll learn how to make basic calls to LLMs and how to extract structured information in a type-safe manner.

## Prerequisites

- Python 3.10 (recommended)
- pip package manager
- A Hugging Face account and access token (recommended; many Hub models require authenticated API calls)

## Environment Setup

```bash
# Create a virtual environment
python3.10 -m venv .venv

# Activate the virtual environment
# On Windows:
.venv\Scripts\activate
# On macOS/Linux:
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt
```
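The example script loads environment variables from a `.env` file via `python-dotenv`. If your chosen model requires authentication, put your Hugging Face access token there. A minimal sketch — the variable name `HUGGINGFACE_API_KEY` below is an assumption; check the dapr-agents client documentation for the exact name it reads:

```
# .env — hypothetical variable name, adjust to what your client expects
HUGGINGFACE_API_KEY=hf_your_token_here
```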

## Examples

### Text

1. Run the basic text completion example:

```bash
python text_completion.py
```

The script demonstrates basic usage of the `HFHubChatClient` for text generation:

```python
from dapr_agents.llm import HFHubChatClient
from dapr_agents.types import UserMessage

from dotenv import load_dotenv

load_dotenv()

# Basic chat completion
llm = HFHubChatClient(model="microsoft/Phi-3-mini-4k-instruct")
response = llm.generate("Name a famous dog!")

if response.get_content():
    print("Response: ", response.get_content())

# Chat completion using a prompty file for context
llm = HFHubChatClient.from_prompty("basic.prompty")
response = llm.generate(input_data={"question": "What is your name?"})

if response.get_content():
    print("Response with prompty: ", response.get_content())

# Chat completion with user input
llm = HFHubChatClient(model="microsoft/Phi-3-mini-4k-instruct")
response = llm.generate(messages=[UserMessage("hello")])

print("Response with user input: ", response.get_content())
```
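The intro also mentions extracting structured information in a type-safe manner. In dapr-agents this is typically done by defining a Pydantic model for the shape you want back; the sketch below is an assumption based on the project's other quickstarts — the `response_format` parameter and its behavior for the Hugging Face client may differ, so treat it as illustrative. The local validation step shows what the client does with the model's JSON output before returning a typed object; the live call is guarded so it only runs when a token is configured (the `HUGGINGFACE_API_KEY` variable name is hypothetical):

```python
import json
import os

from pydantic import BaseModel


# Schema for the structured response we want back from the model.
class Dog(BaseModel):
    name: str
    breed: str
    reason: str


# The schema can be exercised locally: validating raw JSON into a typed
# object is what structured output does with the model's response.
sample = Dog(**json.loads(
    '{"name": "Hachiko", "breed": "Akita", "reason": "Famed for loyalty."}'
))
print(sample.name, "-", sample.breed)

# Live call (requires dapr-agents installed and a Hugging Face token;
# the env var name here is an assumption).
if os.environ.get("HUGGINGFACE_API_KEY"):
    from dapr_agents.llm import HFHubChatClient
    from dapr_agents.types import UserMessage

    llm = HFHubChatClient(model="microsoft/Phi-3-mini-4k-instruct")
    dog = llm.generate(
        messages=[UserMessage("Name a famous dog!")],
        response_format=Dog,  # assumed parameter, per other quickstarts
    )
    print(f"{dog.name} ({dog.breed}): {dog.reason}")
```

Because the result is a validated object rather than raw text, downstream code can rely on the fields existing and having the declared types.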