A conversational agent over unstructured documents with Chainlit

This quickstart demonstrates how to build a fully functional, enterprise-ready agent that can parse unstructured documents, learn their contents, and converse with users about them while remembering all previous interactions. This example also shows how to integrate Dapr with Chainlit, giving users a ready-made chat interface to talk to their agent.

Key Benefits

  • Converse With Unstructured Data: Users can upload documents and have them parsed, contextualized, and made available for conversation
  • Conversational Memory: The agent maintains context across interactions, persisted in the database of your choice
  • Chat UI: An out-of-the-box, LLM-ready chat interface powered by Chainlit
  • Cloud Agnostic: Uploads are handled automatically by Dapr and can be configured to target different storage backends

Prerequisites

  • Python 3.10 (recommended)
  • pip package manager
  • OpenAI API key (for the OpenAI example)
  • Dapr CLI installed

Environment Setup

# Create a virtual environment
python3.10 -m venv .venv

# Activate the virtual environment 
# On Windows:
.venv\Scripts\activate
# On macOS/Linux:
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Initialize Dapr
dapr init

LLM Configuration

This example uses the OpenAI client, which is the default. To target different LLMs, see this example.

Create a .env file in the project root:

OPENAI_API_KEY=your_api_key_here

Replace your_api_key_here with your actual OpenAI API key.

File Upload Configuration (Optional)

Dapr will upload your files to a backend of your choice. The default YAML file in ./components/filestorage.yaml targets an S3 bucket, but it can be reconfigured to use any of the available Dapr output binding components here.

If you leave the YAML file as-is, the example still runs, but the file is not uploaded. An error might appear in the console when you upload a file; if the storage provider is not configured, you can safely ignore it.
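
Under the hood, a file upload through Dapr is an output-binding call. The sketch below shows roughly what that looks like with the Dapr Python SDK; the binding name filestorage, the object key, and the upload_document helper are assumptions for illustration, not necessarily how app.py does it.

# Rough sketch, not the quickstart's own code: upload a local file through a
# Dapr output binding using the Dapr Python SDK.
from dapr.clients import DaprClient

def upload_document(path: str) -> None:   # hypothetical helper
    with open(path, "rb") as f:
        data = f.read()
    with DaprClient() as client:
        # "create" is the standard write operation for storage output bindings;
        # the "key" metadata sets the object name in the target bucket.
        client.invoke_binding(
            binding_name="filestorage",  # assumed to match metadata.name in filestorage.yaml
            operation="create",
            data=data,
            binding_metadata={"key": "red_foxes.pdf"},
        )

upload_document("red_foxes.pdf")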

Examples

Upload a PDF and chat to a document agent

Run the agent:

dapr run --app-id doc-agent --resources-path ./components -- chainlit run app.py -w

Wait until the browser opens up. Once open, you're ready to upload any document and start asking questions about it! You can find the agent page at http://localhost:8000.

Upload a PDF of your choice, or use the red_foxes.pdf file included in this quickstart.

Testing the agent's memory

If you exit the app and restart it, the agent will remember all the previously uploaded documents. The documents are stored in the binding component configured in ./components/filestorage.yaml.

When you install Dapr using dapr init, Redis is installed by default, and this is where the conversation memory is saved. To change it, edit the ./components/conversationmemory.yaml file.
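
For a feel of what that persistence looks like, here is a minimal sketch using the Dapr state API directly; the store name conversationmemory, the session key, and the list-of-messages format are assumptions for illustration, and in the quickstart the agent manages this for you.

# Rough sketch, not the quickstart's own code: save and load a conversation
# history through a Dapr state store using the Dapr Python SDK.
import json
from dapr.clients import DaprClient

STORE_NAME = "conversationmemory"  # assumed to match metadata.name in conversationmemory.yaml

def save_history(session_id: str, history: list) -> None:   # hypothetical helper
    with DaprClient() as client:
        client.save_state(store_name=STORE_NAME, key=session_id, value=json.dumps(history))

def load_history(session_id: str) -> list:   # hypothetical helper
    with DaprClient() as client:
        resp = client.get_state(store_name=STORE_NAME, key=session_id)
        return json.loads(resp.data) if resp.data else []

save_history("session-1", [{"role": "user", "content": "What do red foxes eat?"}])
print(load_history("session-1"))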

Summary

How It Works:

  1. Dapr starts, loading the file storage and conversation history storage configs from the components folder.
  2. Chainlit loads and starts the agent UI in your browser.
  3. When a file is uploaded, its contents are parsed and fed to the agent so it can answer questions about them (see the sketch after this list).
  4. If the file storage component YAML is correctly configured, Dapr will upload the file to the storage provider.
  5. The conversation history is automatically managed by Dapr and saved in the state store configured in ./components/conversationmemory.yaml.
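
The sketch below illustrates steps 2 and 3 with real Chainlit APIs; the ask_llm function is a hypothetical stand-in for the document parsing and the agent's LLM call, and app.py in this folder is the actual implementation.

# Rough sketch of the chat flow, not the quickstart's own code.
import chainlit as cl

async def ask_llm(question: str, document_path: str) -> str:
    # hypothetical stub: the real agent parses the PDF and queries the LLM
    return f"(answer to {question!r} based on {document_path})"

@cl.on_chat_start
async def start():
    # ask the user for a PDF and remember where Chainlit stored it
    files = await cl.AskFileMessage(
        content="Upload a PDF to start chatting about it",
        accept=["application/pdf"],
    ).send()
    cl.user_session.set("document_path", files[0].path)
    await cl.Message(content=f"Loaded {files[0].name}, ask away!").send()

@cl.on_message
async def on_message(message: cl.Message):
    document_path = cl.user_session.get("document_path")
    answer = await ask_llm(message.content, document_path)
    await cl.Message(content=answer).send()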