Run Multi-Agent Workflows in Kubernetes

This quickstart demonstrates how to create and orchestrate event-driven workflows with multiple autonomous agents using Dapr Agents running on Kubernetes.

Prerequisites

  • Python 3.10 (recommended)
  • Pip package manager
  • OpenAI API key
  • Kind
  • Docker
  • Helm
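
A quick way to confirm the tooling is in place before you start (standard version checks, shown here as a convenience):

python3 --version
pip --version
docker --version
kind version
helm version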

Configuration

  1. Create a .env file for your API keys:
OPENAI_API_KEY=your_api_key_here

Install through script

Run the install.sh script in this directory. The script will:

  1. Create a Kind cluster with a local registry
  2. Install Bitnami Redis
  3. Install Dapr
  4. Build the images for 05-multi-agent-workflow-dapr-workflows
  5. Push the images to the local in-cluster registry
  6. Install the Dapr components for the agents
  7. Create the Kubernetes secret from the .env file
  8. Deploy the manifests for the agents
  9. Port-forward workflow-llm on port 8004
  10. Trigger the workflow for getting to Mordor via k8s_http_client.py
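
install.sh automates all of these steps. For orientation, a rough manual equivalent is sketched below; the image name, registry handling, and chart options are illustrative assumptions, not the script's exact contents:

# 1-3: cluster, Redis, and Dapr (the script's Kind + local registry setup is more involved)
kind create cluster --name dapr-agents
helm repo add bitnami https://charts.bitnami.com/bitnami
helm install redis bitnami/redis --set auth.enabled=false
dapr init -k

# 4-5: build the agent images and make them pullable from inside the cluster
docker-compose -f docker-compose.yaml build
kind load docker-image workflow-llm:latest --name dapr-agents   # image name is an assumption

# 6-8: Dapr components, OpenAI secret, and agent deployments
kubectl apply -f components/
kubectl create secret generic openai-secrets --from-env-file=.env
kubectl apply -f manifests/

# 9-10: expose the orchestrator locally and trigger the workflow
kubectl port-forward svc/workflow-llm 8004:80 &
python3 services/client/k8s_http_client.py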

Install through manifests

First create a secret from your .env file:

kubectl create secret generic openai-secrets --from-env-file=.env --namespace default --dry-run=client -o yaml | kubectl apply -f -
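
To confirm the secret landed in the cluster before moving on:

kubectl get secret openai-secrets --namespace default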

Then build the images locally with docker-compose:

docker-compose -f docker-compose.yaml build --no-cache
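
The deployments expect these images to be pullable from inside the cluster. With the local registry from the install script you would tag and push them; with a plain Kind cluster, kind load also works. The image name and registry port below are assumptions, so check docker-compose.yaml for the real tags:

docker tag workflow-llm:latest localhost:5001/workflow-llm:latest
docker push localhost:5001/workflow-llm:latest

# or, without a local registry:
kind load docker-image workflow-llm:latest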

Then deploy the manifests:

kubectl apply -f manifests/
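
Wait for the agent pods to come up before port-forwarding:

kubectl get pods --namespace default --watch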

Port-forward the workflow-llm service:

kubectl port-forward -n default svc/workflow-llm 8004:80 &>/dev/null &
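
Any HTTP response on the forwarded port (even an error status) confirms the forward is live:

curl -i http://localhost:8004/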

Run the client to trigger the workflow:

python3 services/client/k8s_http_client.py
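
Under the hood, the client posts the task to the port-forwarded workflow-llm service. Conceptually that is an HTTP call like the one below; the path and payload are illustrative assumptions, so check k8s_http_client.py for the exact endpoint:

curl -X POST http://localhost:8004/start-workflow \
  -H "Content-Type: application/json" \
  -d '{"task": "How to get to Mordor? We need all the help we can get."}'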