Merge pull request #3 from dapr-sandbox/fix/floki-to-dapr-agents

Migrate floki module to dapr_agents
This commit is contained in:
Roberto Rodriguez 2025-01-25 20:35:15 -08:00 committed by GitHub
commit 37a48c2700
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
167 changed files with 1372 additions and 4101 deletions

View File

@ -1,4 +1,4 @@
# Floki: Agentic Workflows Made Simple
# Dapr Agents: Agentic Workflows Made Simple
[![pypi](https://img.shields.io/pypi/v/floki-ai.svg)](https://pypi.python.org/pypi/floki-ai)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/floki-ai)](https://pypi.org/project/floki-ai/)
@ -7,45 +7,45 @@
![](docs/logo-workflows.png)
> 🚧 Floki is in active development and evolving with ongoing research. APIs and core structures may change as the framework matures and Dapr integration is refined.
> 🚧 Dapr Agents is in active development and evolving with ongoing research. APIs and core structures may change as the framework matures and Dapr integration is refined.
Floki is an open-source framework for researchers and developers to experiment with LLM-based autonomous agents. It provides tools to create, orchestrate, and manage agents while seamlessly connecting to LLM inference APIs. Built on [Dapr](https://docs.dapr.io/), Floki leverages a unified programming model that simplifies microservices and supports both deterministic workflows and event-driven interactions. Using Dapr's Virtual Actor pattern, Floki enables agents to function as independent, self-contained units that process messages sequentially, eliminating concurrency concerns while seamlessly integrating into larger workflows. It also facilitates agent collaboration through Dapr's Pub/Sub integration, where agents communicate via a shared message bus, simplifying the design of workflows where tasks are distributed efficiently, and agents work together to achieve shared goals. By bringing together these features, Floki provides a powerful way to explore agentic workflows and the components that enable multi-agent systems to collaborate and scale, all powered by Dapr.
Dapr Agents is an open-source framework for researchers and developers to experiment with LLM-based autonomous agents. It provides tools to create, orchestrate, and manage agents while seamlessly connecting to LLM inference APIs. Built on [Dapr](https://docs.dapr.io/), Dapr Agents leverages a unified programming model that simplifies microservices and supports both deterministic workflows and event-driven interactions. Using Dapr's Virtual Actor pattern, Dapr Agents enables agents to function as independent, self-contained units that process messages sequentially, eliminating concurrency concerns while seamlessly integrating into larger workflows. It also facilitates agent collaboration through Dapr's Pub/Sub integration, where agents communicate via a shared message bus, simplifying the design of workflows where tasks are distributed efficiently, and agents work together to achieve shared goals. By bringing together these features, Dapr Agents provides a powerful way to explore agentic workflows and the components that enable multi-agent systems to collaborate and scale, all powered by Dapr.
## Documentation (WIP 🚧): https://cyb3rward0g.github.io/floki/
## Why Dapr 🎩?
[Dapr](https://docs.dapr.io/) provides Floki with a unified programming model that simplifies the development of resilient and scalable systems by offering built-in APIs for features such as service invocation, Pub/Sub messaging, workflows, and even state management. These components, essential for defining agentic workflows, allow developers to focus on designing agents and workflows rather than rebuilding foundational features. By leveraging Dapr's sidecar architecture and portable, event-driven runtime, Floki also enables agents to collaborate effectively, share tasks, and adapt dynamically across cloud and edge environments. This seamless integration brings together deterministic workflows and LLM-based decision-making into a unified system, making it easier to experiment with multi-agent systems and scalable agentic workflows.
[Dapr](https://docs.dapr.io/) provides Dapr Agents with a unified programming model that simplifies the development of resilient and scalable systems by offering built-in APIs for features such as service invocation, Pub/Sub messaging, workflows, and even state management. These components, essential for defining agentic workflows, allow developers to focus on designing agents and workflows rather than rebuilding foundational features. By leveraging Dapr's sidecar architecture and portable, event-driven runtime, Dapr Agents also enables agents to collaborate effectively, share tasks, and adapt dynamically across cloud and edge environments. This seamless integration brings together deterministic workflows and LLM-based decision-making into a unified system, making it easier to experiment with multi-agent systems and scalable agentic workflows.
### Key Dapr Features in Floki:
### Key Dapr Features in Dapr Agents:
* 🎯 **Service-to-Service Invocation**: Facilitates direct communication between agents with built-in service discovery, error handling, and distributed tracing. Agents can leverage this for synchronous messaging in multi-agent workflows.
* ⚡️ **Publish and Subscribe**: Supports loosely coupled collaboration between agents through a shared message bus. This enables real-time, event-driven interactions critical for task distribution and coordination.
* 🔄 **Workflow API**: Defines long-running, persistent workflows that combine deterministic processes with LLM-based decision-making. Floki uses this to orchestrate complex multi-step agentic workflows seamlessly.
* 🔄 **Workflow API**: Defines long-running, persistent workflows that combine deterministic processes with LLM-based decision-making. Dapr Agents uses this to orchestrate complex multi-step agentic workflows seamlessly.
* 🧠 **State Management**: Provides a flexible key-value store for agents to retain context across interactions, ensuring continuity and adaptability during workflows.
* 🤖 **Actors**: Implements the Virtual Actor pattern, allowing agents to operate as self-contained, stateful units that handle messages sequentially. This eliminates concurrency concerns and enhances scalability in Floki's agent systems.
* 🤖 **Actors**: Implements the Virtual Actor pattern, allowing agents to operate as self-contained, stateful units that handle messages sequentially. This eliminates concurrency concerns and enhances scalability in Dapr Agents' agent systems (see the example below).
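As a quick illustration of how these building blocks surface in the renamed package, the sketch below defines and runs a single agent. It is not taken from this commit: the constructor arguments (`name`, `role`, `goal`, `instructions`) are inferred from the prompt-template attributes logged in the notebooks further down, and `.run()` mirrors the `AIAgent.run(prompt)` call shown there, so treat the exact signature as an assumption.
```python
# Minimal sketch (assumptions noted inline). Requires an OPENAI_API_KEY in the
# environment; Agent is assumed to accept the attributes listed in the notebook
# logs: name, role, goal, instructions.
from dapr_agents import Agent

weather_agent = Agent(
    name="Stevie",
    role="Weather Assistant",
    goal="Help humans get weather information",
    instructions=["Keep answers short and factual"],
)

# Agents expose a run() entry point, as used with AIAgent.run(prompt) in the
# OpenAPI notebook below.
weather_agent.run("What is the weather like in Seattle today?")
```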
## Install Floki ⚡️
## Install Dapr Agents ⚡️
Make sure you have Python `>=3.9` installed.
### As a Python package using Pip
```bash
pip install floki-ai
pip install dapr-agents
```
### Remotely from GitHub
```bash
pip install git+https://github.com/Cyb3rWard0g/floki.git
pip install git+https://github.com/dapr-sandbox/dapr-agents.git
```
### From source with `poetry`:
```bash
git clone https://github.com/Cyb3rWard0g/floki
git clone https://github.com/dapr-sandbox/dapr-agents
cd floki
cd dapr-agents
poetry install
```
@ -79,10 +79,10 @@ docker ps
```
## Acknowledgments
Floki was born out of a desire to explore and learn more about [Dapr](https://dapr.io/) and its potential for building agentic systems. I wanted to understand how to deploy agents as services, manage message communication, and connect various components effectively. Along the way, I looked to several established frameworks for ideas and guidance, which helped shape my thinking and approach:
Dapr Agents was born out of a desire to explore and learn more about [Dapr](https://dapr.io/) and its potential for building agentic systems. I wanted to understand how to deploy agents as services, manage message communication, and connect various components effectively. Along the way, I looked to several established frameworks for ideas and guidance, which helped shape my thinking and approach:
* https://github.com/microsoft/autogen
* https://github.com/langchain-ai/langchain
* https://github.com/run-llama/llama_deploy
While these frameworks provided valuable insights, Floki is my unique take on how to leverage Dapr for agent-based workflows and systems. It reflects my learning journey and ongoing research in this exciting space.
While these frameworks provided valuable insights, Dapr Agents is my unique take on how to leverage Dapr for agent-based workflows and systems. It reflects my learning journey and ongoing research in this exciting space.

View File

@ -21,7 +21,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv "
]
},
{
@ -37,8 +37,8 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import OpenAPIReActAgent\n",
"from floki.tool.utils import OpenAPISpecParser\n",
"from dapr_agents import OpenAPIReActAgent\n",
"from dapr_agents.tool.utils import OpenAPISpecParser\n",
"from dotenv import load_dotenv\n",
"import logging"
]
@ -202,7 +202,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki.tool.utils.openapi import openapi_spec_to_openai_fn\n",
"from dapr_agents.tool.utils.openapi import openapi_spec_to_openai_fn\n",
"\n",
"functions = openapi_spec_to_openai_fn(spec_parser)"
]
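Taken together, the import and conversion cells above reduce to the sketch below. The import paths and the `openapi_spec_to_openai_fn(spec_parser)` call are copied from the diff; the `OpenAPISpecParser.from_url(...)` helper and the spec URL are assumptions, since the cell that actually builds `spec_parser` falls outside these hunks.
```python
# Sketch of the tool-generation flow in this notebook after the rename.
# NOTE: OpenAPISpecParser.from_url(...) and the URL are assumed -- the cell that
# constructs spec_parser is not visible in this diff.
from dapr_agents.tool.utils import OpenAPISpecParser
from dapr_agents.tool.utils.openapi import openapi_spec_to_openai_fn

spec_parser = OpenAPISpecParser.from_url("https://example.com/msgraph-openapi.yaml")  # assumed helper
functions = openapi_spec_to_openai_fn(spec_parser)  # verbatim from the notebook

# functions is expected to hold OpenAI-style function definitions, one per API
# operation (the logs below embed 236 of them into the tool store).
print(len(functions))
```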
@ -274,21 +274,21 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.document.embedder.sentence:Loading SentenceTransformer model: all-MiniLM-L6-v2\n",
"INFO:dapr_agents.document.embedder.sentence:Downloading SentenceTransformer model: all-MiniLM-L6-v2\n",
"INFO:sentence_transformers.SentenceTransformer:Load pretrained SentenceTransformer: all-MiniLM-L6-v2\n",
"INFO:floki.document.embedder.sentence:Model loaded successfully.\n"
"INFO:dapr_agents.document.embedder.sentence:Model loaded successfully.\n"
]
}
],
"source": [
"from floki.document.embedder import SentenceTransformerEmbedder\n",
"from dapr_agents.document.embedder import SentenceTransformerEmbedder\n",
"\n",
"embedding_function = SentenceTransformerEmbedder(\n",
" model=\"all-MiniLM-L6-v2\"\n",
@ -304,19 +304,19 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 14,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.vectorstores.chroma:ChromaVectorStore initialized with collection: api_toolbox\n"
"INFO:dapr_agents.storage.vectorstores.chroma:ChromaVectorStore initialized with collection: api_toolbox\n"
]
}
],
"source": [
"from floki.storage import ChromaVectorStore\n",
"from dapr_agents.storage import ChromaVectorStore\n",
"\n",
"api_vector_store = ChromaVectorStore(\n",
" name=\"api_toolbox\",\n",
@ -333,23 +333,23 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 15,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.openai.client.base:Initializing OpenAI client...\n",
"INFO:floki.agent.patterns.openapi.react:Setting up VectorToolStore for OpenAPIReActAgent...\n",
"INFO:floki.tool.storage.vectorstore:Adding tools to Vector Tool Store.\n",
"INFO:floki.document.embedder.sentence:Generating embeddings for 236 input(s).\n"
"INFO:dapr_agents.llm.openai.client.base:Initializing OpenAI client...\n",
"INFO:dapr_agents.agent.patterns.openapi.react:Setting up VectorToolStore for OpenAPIReActAgent...\n",
"INFO:dapr_agents.tool.storage.vectorstore:Adding tools to Vector Tool Store.\n",
"INFO:dapr_agents.document.embedder.sentence:Generating embeddings for 236 input(s).\n"
]
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "4a79770bcb3e47729299e8fa6e3a2ef7",
"model_id": "b206f22263d343a8b3976225c67e74b3",
"version_major": 2,
"version_minor": 0
},
@ -364,12 +364,12 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.tool.executor:Tool registered: GetOpenapiDefinition\n",
"INFO:floki.tool.executor:Tool registered: OpenApiCallExecutor\n",
"INFO:floki.tool.executor:Tool Executor initialized with 2 registered tools.\n",
"INFO:floki.agent.base:Constructing system_prompt from agent attributes.\n",
"INFO:floki.agent.base:Using system_prompt to create the prompt template.\n",
"INFO:floki.agent.base:Pre-filled prompt template with attributes: ['name', 'role', 'goal', 'instructions']\n"
"INFO:dapr_agents.tool.executor:Tool registered: GetOpenapiDefinition\n",
"INFO:dapr_agents.tool.executor:Tool registered: OpenApiCallExecutor\n",
"INFO:dapr_agents.tool.executor:Tool Executor initialized with 2 registered tools.\n",
"INFO:dapr_agents.agent.base:Constructing system_prompt from agent attributes.\n",
"INFO:dapr_agents.agent.base:Using system_prompt to create the prompt template.\n",
"INFO:dapr_agents.agent.base:Pre-filled prompt template with attributes: ['name', 'role', 'goal', 'instructions']\n"
]
}
],
@ -391,7 +391,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 16,
"metadata": {},
"outputs": [
{
@ -463,21 +463,21 @@
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.tool.storage.vectorstore:Searching for tools similar to query: Get information about a user with ID da48bd32-94bd-4263-b23a-5b9820a67fab\n",
"INFO:floki.document.embedder.sentence:Generating embeddings for 1 input(s).\n"
"INFO:dapr_agents.tool.storage.vectorstore:Searching for tools similar to query: Get information about a user with ID da48bd32-94bd-4263-b23a-5b9820a67fab\n",
"INFO:dapr_agents.document.embedder.sentence:Generating embeddings for 1 input(s).\n"
]
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "6e773cfbd9cd496288d100e64d250984",
"model_id": "da61a88331eb4d61bd6103a3435b3461",
"version_major": 2,
"version_minor": 0
},
@ -491,39 +491,39 @@
{
"data": {
"text/plain": [
"{'ids': [['006fe777-65a9-452e-9cda-c9845af3bf51',\n",
" '5987353c-e148-4c6d-91f2-4aa7b9adb34c',\n",
" '717427a9-2b23-43d6-983f-d3f0dc938dcd',\n",
" 'a49ffecf-58dd-4be9-8798-793d37bcff36']],\n",
"{'ids': [['62ea8a80-b9c2-4939-8a1a-9751a352f0cb',\n",
" '297e584c-ad03-4b58-b043-6d7013c2189f',\n",
" 'cc49bf8b-8396-4eed-a844-c662687d5722',\n",
" 'd2d35002-7aae-4071-aedf-9b7edf0644bc']],\n",
" 'embeddings': None,\n",
" 'documents': [[\"user.DirectReport_GetCountAsUser: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\",\n",
" \"user_GetDirectReport: The users and contacts that report to the user. (The users and contacts that have their manager property set to this user.) Read-only. Nullable. Supports $expand.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}, 'directoryObject-id': {'type': 'string', 'description': 'The unique identifier of directoryObject'}}, 'required': ['user-id', 'directoryObject-id']}}}\",\n",
" \"user.registeredDevice_GetCount: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\",\n",
" \"user.RegisteredDevice_GetCountAsAppRoleAssignment: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\",\n",
" \"user_GetDirectReportAsUser: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}, 'directoryObject-id': {'type': 'string', 'description': 'The unique identifier of directoryObject'}}, 'required': ['user-id', 'directoryObject-id']}}}\"]],\n",
" \"user.RegisteredDevice_GetCountAsAppRoleAssignment: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\"]],\n",
" 'uris': None,\n",
" 'data': None,\n",
" 'metadatas': [[{'method': 'get',\n",
" 'name': 'user.DirectReport_GetCountAsUser',\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/directReports/microsoft.graph.user/$count'},\n",
" {'method': 'get',\n",
" 'name': 'user_GetDirectReport',\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/directReports/{directoryObject-id}'},\n",
" {'method': 'get',\n",
" 'name': 'user.registeredDevice_GetCount',\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/registeredDevices/$count'},\n",
" {'method': 'get',\n",
" 'name': 'user.RegisteredDevice_GetCountAsAppRoleAssignment',\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/registeredDevices/microsoft.graph.appRoleAssignment/$count'},\n",
" {'method': 'get',\n",
" 'name': 'user_GetDirectReportAsUser',\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/directReports/{directoryObject-id}/microsoft.graph.user'}]],\n",
" 'url': 'https://graph.microsoft.com/v1.0//users/{user-id}/registeredDevices/microsoft.graph.appRoleAssignment/$count'}]],\n",
" 'distances': [[0.6506010293960571,\n",
" 0.6595721244812012,\n",
" 0.6822196245193481,\n",
" 0.684098482131958,\n",
" 0.6847845315933228]],\n",
" 0.684098482131958]],\n",
" 'included': [<IncludeEnum.distances: 'distances'>,\n",
" <IncludeEnum.documents: 'documents'>,\n",
" <IncludeEnum.metadatas: 'metadatas'>]}"
]
},
"execution_count": 14,
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
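The dictionary above is a raw Chroma query response for the user-ID prompt (ids, documents, metadatas, and distances for the top matches). The cell that produced it is outside this hunk; a hypothetical equivalent against the agent's tool store might look like the sketch below, where both the `tool_vector_store` attribute and the `get_similar_tools` method name are invented for illustration.
```python
# Hypothetical sketch only: tool_vector_store and get_similar_tools are invented
# names -- the notebook cell that ran this query is not shown in the diff.
query = "Get information about a user with ID da48bd32-94bd-4263-b23a-5b9820a67fab"
result = AIAgent.tool_vector_store.get_similar_tools(query=query, k=4)  # assumed API

# Per the log lines above, the query is embedded with the SentenceTransformer
# model and matched against the stored tool descriptions, yielding ids,
# documents, metadatas, and distances like the output shown here.
print(result["distances"])
```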
@ -542,16 +542,16 @@
},
{
"cell_type": "code",
"execution_count": 16,
"execution_count": 18,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.agent.patterns.react.base:Iteration 1/10 started.\n",
"INFO:floki.agent.base:Pre-filled prompt template with variables: dict_keys(['chat_history'])\n",
"INFO:floki.llm.openai.chat:Invoking ChatCompletion API.\n"
"INFO:dapr_agents.agent.base:Pre-filled prompt template with variables: dict_keys(['chat_history'])\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 1/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
@ -570,24 +570,27 @@
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:floki.agent.patterns.react.base:Executing GetOpenapiDefinition with arguments {'user_input': 'get user information by user ID'}\n",
"INFO:floki.tool.executor:Attempting to execute tool: GetOpenapiDefinition\n",
"INFO:floki.tool.storage.vectorstore:Searching for tools similar to query: ['get user information by user ID']\n"
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Executing GetOpenapiDefinition with arguments {'user_input': 'Get user information by user ID'}\n",
"INFO:dapr_agents.tool.executor:Attempting to execute tool: GetOpenapiDefinition\n",
"INFO:dapr_agents.tool.storage.vectorstore:Searching for tools similar to query: ['Get user information by user ID']\n",
"INFO:dapr_agents.document.embedder.sentence:Generating embeddings for 1 input(s).\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: To obtain information about a specific user, I need to identify the correct API that provides user data. I'll start by retrieving potential APIs that can provide user information.\u001b[0m\u001b[0m\n",
"\u001b[38;2;191;69;126mAction: {\"name\": \"GetOpenapiDefinition\", \"arguments\": {\"user_input\": \"get user information by user ID\"}}\u001b[0m\u001b[0m\n"
"\u001b[38;2;217;95;118mThought: To get information about a user using the provided ID, we need to interact with an API that handles user data. I will first locate the appropriate OpenAPI definition to identify the correct API endpoint for retrieving user information. \u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mLet's start by using the available tools to find the suitable API.\u001b[0m\u001b[0m\n",
"\u001b[38;2;191;69;126mAction: {\"name\": \"GetOpenapiDefinition\", \"arguments\": {\"user_input\": \"Get user information by user ID\"}}\u001b[0m\u001b[0m\n"
]
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "4e9011b7f8d84cf696c541df080add26",
"model_id": "c55dabbc4e9e472cb3fe2e0104e20742",
"version_major": 2,
"version_minor": 0
},
@ -602,18 +605,21 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.tool.executor:Tool 'GetOpenapiDefinition' executed successfully.\n",
"INFO:floki.agent.patterns.react.base:To obtain information about a specific user, I need to identify the correct API that provides user data. I'll start by retrieving potential APIs that can provide user information.Observation: [\"user_GetManager: Returns the user or organizational contact assigned as the user's manager. Optionally, you can expand the manager's chain up to the root node.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetUser: Read properties and relationships of the user object.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListOauth2PermissionGrant: Retrieve a list of oAuth2PermissionGrant entities, which represent delegated permissions granted to enable a client application to access an API on behalf of the user.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user.oauth2PermissionGrant_GetCount: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetMemberGraphOPreAsGroup: Get groups, directory roles, and administrative units that the user is a direct member of. This operation isn't transitive. 
To retrieve groups, directory roles, and administrative units that the user is a member through transitive membership, use the List user transitive memberOf API.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}, 'directoryObject-id': {'type': 'string', 'description': 'The unique identifier of directoryObject'}}, 'required': ['user-id', 'directoryObject-id']}}}\"]\n",
"INFO:floki.agent.patterns.react.base:Iteration 2/10 started.\n",
"INFO:floki.agent.base:Pre-filled prompt template with variables: dict_keys(['chat_history'])\n",
"INFO:floki.llm.openai.chat:Invoking ChatCompletion API.\n"
"INFO:dapr_agents.tool.executor:Tool 'GetOpenapiDefinition' executed successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Thought:To get information about a user using the provided ID, we need to interact with an API that handles user data. I will first locate the appropriate OpenAPI definition to identify the correct API endpoint for retrieving user information. \n",
"\n",
"Let's start by using the available tools to find the suitable API.\n",
"Action:{'name': 'GetOpenapiDefinition', 'arguments': {'user_input': 'Get user information by user ID'}}\n",
"Observation:[\"user_GetManager: Returns the user or organizational contact assigned as the user's manager. Optionally, you can expand the manager's chain up to the root node.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetUser: Read properties and relationships of the user object.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetPhotoContent: The user's profile photo. Read-only.. Args schema: {'type': 'object', 'properties': {'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListOauth2PermissionGrant: Retrieve a list of oAuth2PermissionGrant entities, which represent delegated permissions granted to enable a client application to access an API on behalf of the user.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListDirectReport: The users and contacts that report to the user. (The users and contacts that have their manager property set to this user.) Read-only. Nullable. Supports $expand.. 
Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\"]\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 2/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;146;94;130mObservation: [\"user_GetManager: Returns the user or organizational contact assigned as the user's manager. Optionally, you can expand the manager's chain up to the root node.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetUser: Read properties and relationships of the user object.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListOauth2PermissionGrant: Retrieve a list of oAuth2PermissionGrant entities, which represent delegated permissions granted to enable a client application to access an API on behalf of the user.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user.oauth2PermissionGrant_GetCount: None. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetMemberGraphOPreAsGroup: Get groups, directory roles, and administrative units that the user is a direct member of. This operation isn't transitive. To retrieve groups, directory roles, and administrative units that the user is a member through transitive membership, use the List user transitive memberOf API.. 
Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}, 'directoryObject-id': {'type': 'string', 'description': 'The unique identifier of directoryObject'}}, 'required': ['user-id', 'directoryObject-id']}}}\"]\u001b[0m\u001b[0m\n"
"\u001b[38;2;146;94;130mObservation: [\"user_GetManager: Returns the user or organizational contact assigned as the user's manager. Optionally, you can expand the manager's chain up to the root node.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetUser: Read properties and relationships of the user object.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_GetPhotoContent: The user's profile photo. Read-only.. Args schema: {'type': 'object', 'properties': {'headers': {'type': 'object', 'properties': {'If-Match': {'type': 'string', 'description': 'ETag'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListOauth2PermissionGrant: Retrieve a list of oAuth2PermissionGrant entities, which represent delegated permissions granted to enable a client application to access an API on behalf of the user.. Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\", \"user_ListDirectReport: The users and contacts that report to the user. (The users and contacts that have their manager property set to this user.) Read-only. Nullable. Supports $expand.. 
Args schema: {'type': 'object', 'properties': {'params': {'type': 'object', 'properties': {'$top': {'minimum': 0.0, 'type': 'integer', 'description': 'Show only the first n items'}, '$skip': {'minimum': 0.0, 'type': 'integer', 'description': 'Skip the first n items'}, '$search': {'type': 'string', 'description': 'Search items by search phrases'}, '$filter': {'type': 'string', 'description': 'Filter items by property values'}, '$count': {'type': 'boolean', 'description': 'Include count of items'}, '$orderby': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Order items by property values'}, '$select': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Select properties to be returned'}, '$expand': {'uniqueItems': True, 'type': 'array', 'items': {'type': 'string'}, 'description': 'Expand related entities'}}, 'required': []}, 'headers': {'type': 'object', 'properties': {'ConsistencyLevel': {'type': 'string', 'description': 'Indicates the requested consistency level. Documentation URL: https://docs.microsoft.com/graph/aad-advanced-queries'}}, 'required': []}, 'path_params': {'type': 'object', 'properties': {'user-id': {'type': 'string', 'description': 'The unique identifier of user'}}, 'required': ['user-id']}}}\"]\u001b[0m\u001b[0m\n"
]
},
{
@ -621,19 +627,105 @@
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:floki.agent.patterns.react.base:Executing OpenApiCallExecutor with arguments {'path_template': '/users/{user-id}', 'method': 'GET', 'path_params': {'user-id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}, 'headers': None, 'params': None}\n",
"INFO:floki.tool.executor:Attempting to execute tool: OpenApiCallExecutor\n"
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Executing OpenApiCallExecutor with arguments {'path_template': 'user_GetUser', 'method': 'GET', 'path_params': {'user-id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}, 'headers': {}, 'params': None, 'data': {}}\n",
"INFO:dapr_agents.tool.executor:Attempting to execute tool: OpenApiCallExecutor\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: To obtain information about a specific user by their ID, the \"user_GetUser\" endpoint seems to be the most appropriate as it allows you to read properties and relationships of the user object.\u001b[0m\n",
"\u001b[38;2;217;95;118mThought: The `user_GetUser` API endpoint is the most appropriate choice for retrieving information about a user given their user ID. \u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mLet's proceed with calling this API.\u001b[0m\u001b[0m\n",
"\u001b[38;2;191;69;126mAction: {\"name\": \"OpenApiCallExecutor\", \"arguments\": {\"path_template\": \"/users/{user-id}\", \"method\": \"GET\", \"path_params\": {\"user-id\": \"da48bd32-94bd-4263-b23a-5b9820a67fab\"}, \"headers\": null, \"params\": null}}\u001b[0m\u001b[0m\n",
"\u001b[38;2;217;95;118mLet's proceed to call this API with the user ID you provided: `da48bd32-94bd-4263-b23a-5b9820a67fab`.\u001b[0m\u001b[0m\n",
"\u001b[38;2;191;69;126mAction: {\"name\": \"OpenApiCallExecutor\", \"arguments\": {\"path_template\": \"user_GetUser\", \"method\": \"GET\", \"path_params\": {\"user-id\": \"da48bd32-94bd-4263-b23a-5b9820a67fab\"}, \"headers\": {}, \"params\": null, \"data\": {}}}\u001b[0m\u001b[0m\n",
"Base Url: https://graph.microsoft.com/v1.0/\n",
"Requested Url: https://graph.microsoft.com/v1.0/user_GetUser\n",
"Requested Parameters: None\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:dapr_agents.tool.executor:Tool 'OpenApiCallExecutor' executed successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Thought:The `user_GetUser` API endpoint is the most appropriate choice for retrieving information about a user given their user ID. \n",
"\n",
"Let's proceed to call this API with the user ID you provided: `da48bd32-94bd-4263-b23a-5b9820a67fab`.\n",
"Action:{'name': 'OpenApiCallExecutor', 'arguments': {'path_template': 'user_GetUser', 'method': 'GET', 'path_params': {'user-id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}, 'headers': {}, 'params': None, 'data': {}}}\n",
"Observation:{'error': {'code': 'BadRequest', 'message': \"Resource not found for the segment 'user_GetUser'.\", 'innerError': {'date': '2025-01-25T23:08:08', 'request-id': 'ad614042-e743-491b-aef4-20c84d70794c', 'client-request-id': 'ad614042-e743-491b-aef4-20c84d70794c'}}}\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 3/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;146;94;130mObservation: {'error': {'code': 'BadRequest', 'message': \"Resource not found for the segment 'user_GetUser'.\", 'innerError': {'date': '2025-01-25T23:08:08', 'request-id': 'ad614042-e743-491b-aef4-20c84d70794c', 'client-request-id': 'ad614042-e743-491b-aef4-20c84d70794c'}}}\u001b[0m\u001b[0m\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:No action specified; continuing with further reasoning.\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 4/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: The attempt to retrieve user information using the `user_GetUser` endpoint resulted in a \"BadRequest\" error. The issue seems to be related to the segment 'user_GetUser' not being recognized or found.\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mIt is possible that the path template has been incorrectly interpreted or specified. Let's make sure the path is correctly using the OpenAPI definition.\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mI'll re-check the correct path format and try again. Please allow me a moment.\u001b[0m\u001b[0m\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:No action specified; continuing with further reasoning.\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 5/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: Let me review the available OpenAPI definitions again to confirm the path template for retrieving user information. It's possible I made an error with the interpretation. I will ensure that the request path is accurate.\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mLet's verify and try the correct operation once more.\u001b[0m\u001b[0m\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Executing OpenApiCallExecutor with arguments {'path_template': '/users/{user-id}', 'method': 'GET', 'path_params': {'user-id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}, 'headers': {}, 'params': None, 'data': {}}\n",
"INFO:dapr_agents.tool.executor:Attempting to execute tool: OpenApiCallExecutor\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: I realize that the earlier attempt to use the `user_GetUser` operation failed due to an error in specifying the path. I will correct this by ensuring that the path template and its usage align with standard URI path conventions.\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mLet me re-attempt the API call with the properly formatted path. I need to review and construct the correct API endpoint for execution.\u001b[0m\u001b[0m\n",
"\u001b[38;2;191;69;126mAction: {\"name\": \"OpenApiCallExecutor\", \"arguments\": {\"path_template\": \"/users/{user-id}\", \"method\": \"GET\", \"path_params\": {\"user-id\": \"da48bd32-94bd-4263-b23a-5b9820a67fab\"}, \"headers\": {}, \"params\": null, \"data\": {}}}\u001b[0m\u001b[0m\n",
"Base Url: https://graph.microsoft.com/v1.0/\n",
"Requested Url: https://graph.microsoft.com/v1.0/users/da48bd32-94bd-4263-b23a-5b9820a67fab\n",
"Requested Parameters: None\n"
@ -643,13 +735,14 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.tool.executor:Tool 'OpenApiCallExecutor' executed successfully.\n",
"INFO:floki.agent.patterns.react.base:To obtain information about a specific user by their ID, the \"user_GetUser\" endpoint seems to be the most appropriate as it allows you to read properties and relationships of the user object.\n",
"INFO:dapr_agents.tool.executor:Tool 'OpenApiCallExecutor' executed successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Thought:I realize that the earlier attempt to use the `user_GetUser` operation failed due to an error in specifying the path. I will correct this by ensuring that the path template and its usage align with standard URI path conventions.\n",
"\n",
"Let's proceed with calling this API.Observation: {'@odata.context': 'https://graph.microsoft.com/v1.0/$metadata#users/$entity', 'businessPhones': [], 'displayName': 'Support user', 'givenName': None, 'jobTitle': None, 'mail': None, 'mobilePhone': None, 'officeLocation': None, 'preferredLanguage': None, 'surname': None, 'userPrincipalName': 'support@blueteamarsenal.onmicrosoft.com', 'id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}\n",
"INFO:floki.agent.patterns.react.base:Iteration 3/10 started.\n",
"INFO:floki.agent.base:Pre-filled prompt template with variables: dict_keys(['chat_history'])\n",
"INFO:floki.llm.openai.chat:Invoking ChatCompletion API.\n"
"Let me re-attempt the API call with the properly formatted path. I need to review and construct the correct API endpoint for execution.\n",
"Action:{'name': 'OpenApiCallExecutor', 'arguments': {'path_template': '/users/{user-id}', 'method': 'GET', 'path_params': {'user-id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}, 'headers': {}, 'params': None, 'data': {}}}\n",
"Observation:{'@odata.context': 'https://graph.microsoft.com/v1.0/$metadata#users/$entity', 'businessPhones': [], 'displayName': 'Support user', 'givenName': None, 'jobTitle': None, 'mail': None, 'mobilePhone': None, 'officeLocation': None, 'preferredLanguage': None, 'surname': None, 'userPrincipalName': 'support@blueteamarsenal.onmicrosoft.com', 'id': 'da48bd32-94bd-4263-b23a-5b9820a67fab'}\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 6/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
@ -664,20 +757,21 @@
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:floki.agent.patterns.react.base:Agent is responding directly.\n"
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:No action specified; continuing with further reasoning.\n",
"INFO:dapr_agents.agent.patterns.react.base:Iteration 7/10 started.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: The information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab` is as follows:\u001b[0m\n",
"\u001b[38;2;217;95;118mThought: I successfully retrieved the information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab`. Here are the details:\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Display Name**: Support user\u001b[0m\n",
"\u001b[38;2;217;95;118m- **User Principal Name**: support@blueteamarsenal.onmicrosoft.com\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Business Phones**: None listed\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Given Name**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Business Phones**: []\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Job Title**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Mail**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Mobile Phone**: Not available\u001b[0m\n",
@ -685,17 +779,44 @@
"\u001b[38;2;217;95;118m- **Preferred Language**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Surname**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mIf you need further details or to perform additional operations, let me know!\u001b[0m\u001b[0m\n",
"\u001b[38;2;217;95;118mIf you need further details or assistance, feel free to ask!\u001b[0m\u001b[0m\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.agent.patterns.react.base:Agent provided a direct final answer.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[38;2;217;95;118mThought: Answer: I successfully retrieved the information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab`. Here are the details:\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Display Name**: Support user\u001b[0m\n",
"\u001b[38;2;217;95;118m- **User Principal Name**: support@blueteamarsenal.onmicrosoft.com\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Business Phones**: []\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Job Title**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Mail**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Mobile Phone**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Office Location**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Preferred Language**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m- **Surname**: Not available\u001b[0m\n",
"\u001b[38;2;217;95;118m\u001b[0m\n",
"\u001b[38;2;217;95;118mIf you need further details or assistance, feel free to ask!\u001b[0m\u001b[0m\n",
"\u001b[0m\u001b[0m\n",
"\u001b[0m--------------------------------------------------------------------------------\u001b[0m\n",
"\u001b[0m\u001b[0m\u001b[0m\n",
"\u001b[38;2;147;191;183massistant:\u001b[0m\n",
"\u001b[38;2;147;191;183m\u001b[0m\u001b[38;2;147;191;183mThe information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab` is as follows:\u001b[0m\n",
"\u001b[38;2;147;191;183m\u001b[0m\u001b[38;2;147;191;183mI successfully retrieved the information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab`. Here are the details:\u001b[0m\n",
"\u001b[38;2;147;191;183m\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Display Name**: Support user\u001b[0m\n",
"\u001b[38;2;147;191;183m- **User Principal Name**: support@blueteamarsenal.onmicrosoft.com\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Business Phones**: None listed\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Given Name**: Not available\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Business Phones**: []\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Job Title**: Not available\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Mail**: Not available\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Mobile Phone**: Not available\u001b[0m\n",
@ -703,21 +824,22 @@
"\u001b[38;2;147;191;183m- **Preferred Language**: Not available\u001b[0m\n",
"\u001b[38;2;147;191;183m- **Surname**: Not available\u001b[0m\n",
"\u001b[38;2;147;191;183m\u001b[0m\n",
"\u001b[38;2;147;191;183mIf you need further details or to perform additional operations, let me know!\u001b[0m\u001b[0m\n"
"\u001b[38;2;147;191;183mIf you need further details or assistance, feel free to ask!\u001b[0m\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"'The information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab` is as follows:\\n\\n- **Display Name**: Support user\\n- **User Principal Name**: support@blueteamarsenal.onmicrosoft.com\\n- **Business Phones**: None listed\\n- **Given Name**: Not available\\n- **Job Title**: Not available\\n- **Mail**: Not available\\n- **Mobile Phone**: Not available\\n- **Office Location**: Not available\\n- **Preferred Language**: Not available\\n- **Surname**: Not available\\n\\nIf you need further details or to perform additional operations, let me know!'"
"'I successfully retrieved the information for the user with ID `da48bd32-94bd-4263-b23a-5b9820a67fab`. Here are the details:\\n\\n- **Display Name**: Support user\\n- **User Principal Name**: support@blueteamarsenal.onmicrosoft.com\\n- **Business Phones**: []\\n- **Job Title**: Not available\\n- **Mail**: Not available\\n- **Mobile Phone**: Not available\\n- **Office Location**: Not available\\n- **Preferred Language**: Not available\\n- **Surname**: Not available\\n\\nIf you need further details or assistance, feel free to ask!'"
]
},
"execution_count": 16,
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"prompt = \"Get information about a user with ID da48bd32-94bd-4263-b23a-5b9820a67fab\"\n",
"AIAgent.run(prompt)"
]
},
@ -745,7 +867,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -21,7 +21,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -65,7 +65,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import Agent"
"from dapr_agents import Agent"
]
},
{

View File

@ -21,7 +21,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -33,7 +33,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 1,
"metadata": {},
"outputs": [
{
@ -42,7 +42,7 @@
"True"
]
},
"execution_count": 2,
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
@ -61,20 +61,11 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Users/wardog/Documents/GitHub/floki/.venv/lib/python3.12/site-packages/pydantic/_internal/_generate_schema.py:777: UserWarning: Mixing V1 models and V2 models (or constructs, like `TypeAdapter`) is not supported. Please upgrade `Settings` to V2.\n",
" warn(\n"
]
}
],
"outputs": [],
"source": [
"from floki import Agent"
"from dapr_agents import Agent"
]
},
{
@ -265,7 +256,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -1,5 +1,5 @@
from typing import Optional
from floki import AgentTool
from dapr_agents import AgentTool
from datetime import datetime
import requests
import time

File diff suppressed because one or more lines are too long

View File

@ -6,7 +6,7 @@
"source": [
"# GraphStore: Neo4j Database Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `Neo4jGraphStore` in `Floki` for basic graph-based tasks. We will explore:\n",
"This notebook demonstrates how to use the `Neo4jGraphStore` in `dapr-agents` for basic graph-based tasks. We will explore:\n",
"\n",
"* Initializing the `Neo4jGraphStore` class.\n",
"* Adding sample nodes.\n",
@ -21,7 +21,7 @@
"source": [
"## Install Required Libraries\n",
"\n",
"Ensure floki and neo4j are installed:"
"Ensure dapr_agents and neo4j are installed:"
]
},
{
@ -30,7 +30,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai neo4j"
"!pip install dapr-agents python-dotenv neo4j"
]
},
{
@ -97,7 +97,7 @@
"#docker run \\\n",
"#--restart always \\\n",
"#--publish=7474:7474 --publish=7687:7687 \\\n",
"#--env NEO4J_AUTH=neo4j/neo4j \\\n",
"#--env NEO4J_AUTH=neo4j/graphwardog \\\n",
"#--volume=neo4j-data \\\n",
"#--name neo4j-apoc \\\n",
"#--env NEO4J_apoc_export_file_enabled=true \\\n",
@ -118,20 +118,20 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.client:Successfully created the driver for URI: bolt://localhost:7687\n",
"INFO:floki.storage.graphstores.neo4j.base:Neo4jGraphStore initialized with database neo4j\n"
"INFO:dapr_agents.storage.graphstores.neo4j.client:Successfully created the driver for URI: bolt://localhost:7687\n",
"INFO:dapr_agents.storage.graphstores.neo4j.base:Neo4jGraphStore initialized with database neo4j\n"
]
}
],
"source": [
"from floki.storage.graphstores.neo4j import Neo4jGraphStore\n",
"from dapr_agents.storage.graphstores.neo4j import Neo4jGraphStore\n",
"import os\n",
"\n",
"# Initialize Neo4jGraphStore\n",
@ -145,14 +145,14 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.client:Connected to Neo4j Kernel version 5.15.0 (community edition)\n"
"INFO:dapr_agents.storage.graphstores.neo4j.client:Connected to Neo4j Kernel version 5.15.0 (community edition)\n"
]
},
{
@ -179,11 +179,27 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 7,
"metadata": {},
"outputs": [],
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:dapr_agents.storage.graphstores.neo4j.base:Processed batch 1/1\n",
"INFO:dapr_agents.storage.graphstores.neo4j.base:Nodes with label `Person` added successfully.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Nodes added successfully\n"
]
}
],
"source": [
"from floki.types import Node\n",
"from dapr_agents.types import Node\n",
"\n",
"# Sample nodes\n",
"nodes = [\n",
@ -216,11 +232,28 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 8,
"metadata": {},
"outputs": [],
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:dapr_agents.storage.graphstores.neo4j.base:Processed batch 1/1\n",
"INFO:neo4j.notifications:Received notification from DBMS server: {severity: INFORMATION} {code: Neo.ClientNotification.Statement.CartesianProduct} {category: PERFORMANCE} {title: This query builds a cartesian product between disconnected patterns.} {description: If a part of a query contains multiple disconnected patterns, this will build a cartesian product between all those parts. This may produce a large amount of data and slow down query processing. While occasionally intended, it may often be possible to reformulate the query that avoids the use of this cross product, perhaps by adding a relationship between the different parts or by using OPTIONAL MATCH (identifier is: (b))} {position: line: 3, column: 25, offset: 45} for query: '\\n UNWIND $data AS rel\\n MATCH (a {id: rel.source_node_id}), (b {id: rel.target_node_id})\\n MERGE (a)-[r:`KNOWS`]->(b)\\n ON CREATE SET r.createdAt = rel.current_time\\n SET r.updatedAt = rel.current_time, r += rel.properties\\n RETURN r\\n '\n",
"INFO:dapr_agents.storage.graphstores.neo4j.base:Relationships of type `KNOWS` added successfully.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Relationships added successfully\n"
]
}
],
"source": [
"from floki.types import Relationship\n",
"from dapr_agents.types import Relationship\n",
"\n",
"# Sample relationships\n",
"relationships = [\n",
@ -246,14 +279,14 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.base:Query executed successfully: MATCH (n) RETURN n | Time: 0.01 seconds | Results: 2\n"
"INFO:dapr_agents.storage.graphstores.neo4j.base:Query executed successfully: MATCH (n) RETURN n | Time: 0.05 seconds | Results: 2\n"
]
},
{
@ -261,8 +294,8 @@
"output_type": "stream",
"text": [
"Nodes in the database:\n",
"{'n': {'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-13T06:56:37.251710Z'}}\n",
"{'n': {'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-13T06:56:37.251710Z'}}\n"
"{'n': {'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-25T23:17:18.044482Z'}}\n",
"{'n': {'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-25T23:17:18.044482Z'}}\n"
]
}
],
@ -276,17 +309,17 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.base:Query executed successfully: \n",
"INFO:dapr_agents.storage.graphstores.neo4j.base:Query executed successfully: \n",
"MATCH (a)-[r]->(b)\n",
"RETURN a.id AS source, b.id AS target, type(r) AS type, properties(r) AS properties\n",
" | Time: 0.01 seconds | Results: 1\n"
" | Time: 0.10 seconds | Results: 1\n"
]
},
{
@ -294,7 +327,7 @@
"output_type": "stream",
"text": [
"Relationships in the database:\n",
"{'source': '1', 'target': '2', 'type': 'KNOWS', 'properties': {'updatedAt': '2025-01-13T07:01:05.566187Z', 'createdAt': '2025-01-13T07:01:05.566187Z', 'since': '2023'}}\n"
"{'source': '1', 'target': '2', 'type': 'KNOWS', 'properties': {'updatedAt': '2025-01-25T23:17:22.428873Z', 'createdAt': '2025-01-25T23:17:22.428873Z', 'since': '2023'}}\n"
]
}
],
@ -311,17 +344,17 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.base:Query executed successfully: \n",
"INFO:dapr_agents.storage.graphstores.neo4j.base:Query executed successfully: \n",
"MATCH (n)-[r]->(m)\n",
"RETURN n, r, m\n",
" | Time: 0.09 seconds | Results: 1\n"
" | Time: 0.05 seconds | Results: 1\n"
]
},
{
@ -329,7 +362,7 @@
"output_type": "stream",
"text": [
"Nodes and relationships in the database:\n",
"{'n': {'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-13T06:56:37.251710Z'}, 'r': ({'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-13T06:56:37.251710Z'}, 'KNOWS', {'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-13T06:56:37.251710Z'}), 'm': {'createdAt': '2025-01-13T06:56:37.251710Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-13T06:56:37.251710Z'}}\n"
"{'n': {'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-25T23:17:18.044482Z'}, 'r': ({'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Alice', 'id': '1', 'age': 30, 'updatedAt': '2025-01-25T23:17:18.044482Z'}, 'KNOWS', {'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-25T23:17:18.044482Z'}), 'm': {'createdAt': '2025-01-25T23:17:18.044482Z', 'name': 'Bob', 'id': '2', 'age': 25, 'updatedAt': '2025-01-25T23:17:18.044482Z'}}\n"
]
}
],
@ -353,14 +386,14 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.base:Database reset successfully\n"
"INFO:dapr_agents.storage.graphstores.neo4j.base:Database reset successfully\n"
]
},
{
@ -378,14 +411,14 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.storage.graphstores.neo4j.base:Query executed successfully: MATCH (n) RETURN n | Time: 0.00 seconds | Results: 0\n"
"INFO:dapr_agents.storage.graphstores.neo4j.base:Query executed successfully: MATCH (n) RETURN n | Time: 0.01 seconds | Results: 0\n"
]
},
{
@ -428,7 +461,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: Azure OpenAI Chat Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `OpenAIChatClient` in `Floki` for basic tasks with the Azure OpenAI Chat API. We will explore:\n",
"This notebook demonstrates how to use the `OpenAIChatClient` in `dapr-agents` for basic tasks with the Azure OpenAI Chat API. We will explore:\n",
"\n",
"* Initializing the OpenAI Chat client.\n",
"* Generating responses to simple prompts.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -41,7 +41,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 1,
"metadata": {},
"outputs": [
{
@ -50,7 +50,7 @@
"True"
]
},
"execution_count": 13,
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
@ -69,11 +69,11 @@
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from floki import OpenAIChatClient"
"from dapr_agents import OpenAIChatClient"
]
},
{
@ -87,7 +87,7 @@
},
{
"cell_type": "code",
"execution_count": 15,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
@ -103,16 +103,16 @@
},
{
"cell_type": "code",
"execution_count": 16,
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One famous dog is Lassie, the fictional Rough Collie from the novel \"Lassie Come-Home\" by Eric Knight, as well as the classic TV series and movies. Lassie is known for her intelligence, loyalty, and heroism.', role='assistant'), logprobs=None)], created=1736627015, id='chatcmpl-AocJ5PphWonzqhHBy3ftha2QNr2V2', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 51, 'prompt_tokens': 12, 'total_tokens': 63, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One famous dog is Lassie, the Rough Collie from the classic TV show and movies. Lassie is known for her intelligence, loyalty, and bravery.', role='assistant'), logprobs=None)], created=1737847452, id='chatcmpl-AtjnYMQpvqyUJmnwJX8DKgIfgxmkn', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 32, 'prompt_tokens': 12, 'total_tokens': 44, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
]
},
"execution_count": 16,
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
@ -128,14 +128,14 @@
},
{
"cell_type": "code",
"execution_count": 17,
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'content': 'One famous dog is Lassie, the fictional Rough Collie from the novel \"Lassie Come-Home\" by Eric Knight, as well as the classic TV series and movies. Lassie is known for her intelligence, loyalty, and heroism.', 'role': 'assistant'}\n"
"{'content': 'One famous dog is Lassie, the Rough Collie from the classic TV show and movies. Lassie is known for her intelligence, loyalty, and bravery.', 'role': 'assistant'}\n"
]
}
],
@ -154,7 +154,7 @@
},
{
"cell_type": "code",
"execution_count": 18,
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@ -163,7 +163,7 @@
},
{
"cell_type": "code",
"execution_count": 19,
"execution_count": 7,
"metadata": {},
"outputs": [
{
@ -172,7 +172,7 @@
"ChatPromptTemplate(input_variables=['question'], pre_filled_variables={}, messages=[SystemMessage(content='You are an AI assistant who helps people find information.\\nAs the assistant, you answer questions briefly, succinctly.', role='system'), UserMessage(content='{{question}}', role='user')], template_format='jinja2')"
]
},
"execution_count": 19,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
@ -183,16 +183,16 @@
},
{
"cell_type": "code",
"execution_count": 20,
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"I am an AI assistant and don't have a personal name, but you can call me Assistant.\", role='assistant'), logprobs=None)], created=1736627018, id='chatcmpl-AocJ8p4NxzUw9hjIXg5YlOVLrbiQW', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 19, 'prompt_tokens': 39, 'total_tokens': 58, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"I don't have a personal name, but you can call me Assistant.\", role='assistant'), logprobs=None)], created=1737847479, id='chatcmpl-AtjnzubOoHW5WfJq78mLhmd3aGiOW', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 14, 'prompt_tokens': 39, 'total_tokens': 53, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
]
},
"execution_count": 20,
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
@ -210,7 +210,7 @@
},
{
"cell_type": "code",
"execution_count": 21,
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
@ -224,7 +224,7 @@
},
{
"cell_type": "code",
"execution_count": 22,
"execution_count": 10,
"metadata": {},
"outputs": [
{
@ -236,7 +236,7 @@
}
],
"source": [
"from floki.types import UserMessage\n",
"from dapr_agents.types import UserMessage\n",
"\n",
"# Generate a response using structured messages\n",
"response = llm.generate(messages=[UserMessage(\"hello\")])\n",
@ -247,7 +247,7 @@
},
{
"cell_type": "code",
"execution_count": 23,
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
@ -278,7 +278,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: ElevenLabs Text-To-Speech Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `ElevenLabsSpeechClient` in Floki for basic tasks with the [ElevenLabs Text-To-Speech Endpoint](https://elevenlabs.io/docs/api-reference/text-to-speech/convert). We will explore:\n",
"This notebook demonstrates how to use the `ElevenLabsSpeechClient` in dapr-agents for basic tasks with the [ElevenLabs Text-To-Speech Endpoint](https://elevenlabs.io/docs/api-reference/text-to-speech/convert). We will explore:\n",
"\n",
"* Initializing the `ElevenLabsSpeechClient`.\n",
"* Generating speech from text and saving it as an MP3 file.."
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai elevenlabs"
"!pip install dapr-agents python-dotenv elevenlabs"
]
},
{
@ -39,20 +39,9 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
],
"outputs": [],
"source": [
"from dotenv import load_dotenv\n",
"load_dotenv()"
@ -87,20 +76,11 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.elevenlabs.client:Initializing ElevenLabs API client...\n",
"INFO:floki.llm.elevenlabs.client:ElevenLabs client initialized successfully.\n"
]
}
],
"outputs": [],
"source": [
"from floki import ElevenLabsSpeechClient\n",
"from dapr_agents import ElevenLabsSpeechClient\n",
"\n",
"client = ElevenLabsSpeechClient(\n",
" model=\"eleven_multilingual_v2\", # Default model\n",
@ -121,19 +101,9 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.elevenlabs.speech:Generating speech with voice 'JBFqnCBsd6RMkjVDRZzb', model 'eleven_multilingual_v2'.\n",
"INFO:floki.llm.elevenlabs.speech:Collecting audio bytes.\n",
"INFO:httpx:HTTP Request: POST https://api.elevenlabs.io/v1/text-to-speech/JBFqnCBsd6RMkjVDRZzb?optimize_streaming_latency=0&output_format=mp3_44100_128 \"HTTP/1.1 200 OK\"\n"
]
}
],
"outputs": [],
"source": [
"# Define the text to convert to speech\n",
"text = \"Hello Roberto! This is an example of text-to-speech generation.\"\n",
@ -147,17 +117,9 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Audio saved to output_speech.mp3\n"
]
}
],
"outputs": [],
"source": [
"# Save the audio to an MP3 file\n",
"output_path = \"output_speech.mp3\"\n",
@ -178,20 +140,9 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.elevenlabs.speech:Generating speech with voice 'JBFqnCBsd6RMkjVDRZzb', model 'eleven_multilingual_v2'.\n",
"INFO:floki.llm.elevenlabs.speech:Saving audio to file: output_speech_auto.mp3 (mode: wb)\n",
"INFO:httpx:HTTP Request: POST https://api.elevenlabs.io/v1/text-to-speech/JBFqnCBsd6RMkjVDRZzb?optimize_streaming_latency=0&output_format=mp3_44100_128 \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.elevenlabs.speech:Audio saved to output_speech_auto.mp3\n"
]
}
],
"outputs": [],
"source": [
"# Define the text to convert to speech\n",
"text = \"Hello Roberto! This is another example of text-to-speech generation.\"\n",
@ -228,7 +179,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: Hugging Face Chat Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `HFHubChatClient` in `Floki` for basic tasks with the Hugging Face Chat API. We will explore:\n",
"This notebook demonstrates how to use the `HFHubChatClient` in `dapr-agents` for basic tasks with the Hugging Face Chat API. We will explore:\n",
"\n",
"* Initializing the Hugging Face Chat client.\n",
"* Generating responses to simple prompts.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -91,7 +91,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import HFHubChatClient"
"from dapr_agents import HFHubChatClient"
]
},
{
@ -133,8 +133,8 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:floki.llm.huggingface.chat:Chat completion retrieved successfully.\n"
"INFO:dapr_agents.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:dapr_agents.llm.huggingface.chat:Chat completion retrieved successfully.\n"
]
}
],
@ -151,7 +151,7 @@
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='length', index=0, message=MessageContent(content='One famous dog is Lassie. Lassie is an American TV dog known for her roles as a heroic and highly affectionate, domestic family pet in several TV movies and television series from 1943 to 1973. The numerous characters in the series were portrayed by several different dogs named Lassie, such as Pal, Silver, Star, Herbie, and Molly. One lesser-known fact about Lassie is that the dog', role='assistant'), logprobs=None)], created=1736270647, id='', model='microsoft/Phi-3-mini-4k-instruct', object='chat.completion', usage={'completion_tokens': 100, 'prompt_tokens': 8, 'total_tokens': 108})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One of the most famous dogs in history is Lassie, a fictional character created by Eric Knight. Lassie is an energetic and intelligent collie dog who has appeared in many popular films, television shows, and literature. The original book, \"Lassie Come-Home\" by Eleanor Smith, was published in 1940 and tells the story of a collie pup named Lassie who crosses the English Channel to seek her brother who has been sent to live with new owners in the American West. Today, Lassie\\'s legacy continues through various adaptations and remakes, making her one of the most recognizable and cherished dog characters in entertainment history.', role='assistant'), logprobs=None)], created=1737847799, id='', model='microsoft/Phi-3-mini-4k-instruct', object='chat.completion', usage={'completion_tokens': 150, 'prompt_tokens': 8, 'total_tokens': 158})"
]
},
"execution_count": 6,
@ -172,7 +172,7 @@
{
"data": {
"text/plain": [
"{'content': 'One famous dog is Lassie. Lassie is an American TV dog known for her roles as a heroic and highly affectionate, domestic family pet in several TV movies and television series from 1943 to 1973. The numerous characters in the series were portrayed by several different dogs named Lassie, such as Pal, Silver, Star, Herbie, and Molly. One lesser-known fact about Lassie is that the dog',\n",
"{'content': 'One of the most famous dogs in history is Lassie, a fictional character created by Eric Knight. Lassie is an energetic and intelligent collie dog who has appeared in many popular films, television shows, and literature. The original book, \"Lassie Come-Home\" by Eleanor Smith, was published in 1940 and tells the story of a collie pup named Lassie who crosses the English Channel to seek her brother who has been sent to live with new owners in the American West. Today, Lassie\\'s legacy continues through various adaptations and remakes, making her one of the most recognizable and cherished dog characters in entertainment history.',\n",
" 'role': 'assistant'}"
]
},
@ -193,7 +193,7 @@
{
"data": {
"text/plain": [
"'One famous dog is Lassie. Lassie is an American TV dog known for her roles as a heroic and highly affectionate, domestic family pet in several TV movies and television series from 1943 to 1973. The numerous characters in the series were portrayed by several different dogs named Lassie, such as Pal, Silver, Star, Herbie, and Molly. One lesser-known fact about Lassie is that the dog'"
"'One of the most famous dogs in history is Lassie, a fictional character created by Eric Knight. Lassie is an energetic and intelligent collie dog who has appeared in many popular films, television shows, and literature. The original book, \"Lassie Come-Home\" by Eleanor Smith, was published in 1940 and tells the story of a collie pup named Lassie who crosses the English Channel to seek her brother who has been sent to live with new owners in the American West. Today, Lassie\\'s legacy continues through various adaptations and remakes, making her one of the most recognizable and cherished dog characters in entertainment history.'"
]
},
"execution_count": 8,
@ -232,15 +232,15 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.huggingface.chat:Using prompt template to generate messages.\n",
"INFO:floki.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:floki.llm.huggingface.chat:Chat completion retrieved successfully.\n"
"INFO:dapr_agents.llm.huggingface.chat:Using prompt template to generate messages.\n",
"INFO:dapr_agents.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:dapr_agents.llm.huggingface.chat:Chat completion retrieved successfully.\n"
]
},
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='length', index=0, message=MessageContent(content=\"I'm Phi and my purpose here as Microsofts language model GPT-3 developed by MS research team to assist users with their inquiries across various domains of knowledge while ensuring a safe environment for interaction without personal data storage or usage beyond this session unless explicitly permitted otherwise within privacy guidelines set forth during our conversation initiation process which strictly adheres not only legal but also ethical standards including respecting user confidentiality at all times irrespective if it involves sensitive topics like mental health issues where professional help should be sought instead when necessary; however please note that despite these precautions taken towards\", role='assistant'), logprobs=None)], created=1736270655, id='', model='microsoft/Phi-3-mini-4k-instruct', object='chat.completion', usage={'completion_tokens': 128, 'prompt_tokens': 36, 'total_tokens': 164})"
"ChatCompletion(choices=[Choice(finish_reason='length', index=0, message=MessageContent(content='I\\'m Phi and Microsoft calls me Assistant; its a role rather than just having one specific identity like humans do with names such as \"John\" or “Jane”. My purpose revolves around assisting users in finding answers to their queries across various topics including but not limited by science fiction literature analysis! So while there isnt any personal \\'name\\', feel free reach out whenever need assistance navigating through complex narratives of interstellar travel tales written between 1950-23rd century Earth timeframe (or even beyond). Let us embark on this intellectual journey together today - let', role='assistant'), logprobs=None)], created=1737847810, id='', model='microsoft/Phi-3-mini-4k-instruct', object='chat.completion', usage={'completion_tokens': 128, 'prompt_tokens': 36, 'total_tokens': 164})"
]
},
"execution_count": 10,
@ -268,13 +268,13 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:floki.llm.huggingface.chat:Chat completion retrieved successfully.\n"
"INFO:dapr_agents.llm.huggingface.chat:Invoking Hugging Face ChatCompletion API.\n",
"INFO:dapr_agents.llm.huggingface.chat:Chat completion retrieved successfully.\n"
]
}
],
"source": [
"from floki.types import UserMessage\n",
"from dapr_agents.types import UserMessage\n",
"\n",
"# Initialize the client\n",
"llm = HFHubChatClient()\n",
@ -292,7 +292,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
"{'content': \"Hello! How can I assist you today? Whether you need help with a question, want to have a chat, or just want to say hi, I'm here for you. How's your day going so far? 😊\", 'role': 'assistant'}\n"
"{'content': 'Hello! How can I assist you today? If you have any questions, concerns, or just want to chat, feel free! 😊', 'role': 'assistant'}\n"
]
}
],
@ -334,7 +334,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: NVIDIA Chat Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `NVIDIAChatClient` in `Floki` for basic tasks with the NVIDIA Chat API. We will explore:\n",
"This notebook demonstrates how to use the `NVIDIAChatClient` in `dapr-agents` for basic tasks with the NVIDIA Chat API. We will explore:\n",
"\n",
"* Initializing the `NVIDIAChatClient`.\n",
"* Generating responses to simple prompts.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -73,7 +73,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import NVIDIAChatClient"
"from dapr_agents import NVIDIAChatClient"
]
},
{
@ -97,9 +97,20 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 4,
"metadata": {},
"outputs": [],
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"That's an easy one! One of the most famous dogs is probably Laika, the Soviet space dog. She was the first living creature to orbit the Earth, launched into space on November 3, 1957, and paved the way for human spaceflight.\", role='assistant'), logprobs=None)], created=1737847856, id='cmpl-3a55cbd9f8344bb8bbd7603ae83b2088', model='meta/llama3-8b-instruct', object='chat.completion', usage={'completion_tokens': 55, 'prompt_tokens': 15, 'total_tokens': 70, 'completion_tokens_details': None, 'prompt_tokens_details': None})"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Generate a response\n",
"response = llm.generate('Name a famous dog!')\n",
@ -110,9 +121,17 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 5,
"metadata": {},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'content': \"That's an easy one! One of the most famous dogs is probably Laika, the Soviet space dog. She was the first living creature to orbit the Earth, launched into space on November 3, 1957, and paved the way for human spaceflight.\", 'role': 'assistant'}\n"
]
}
],
"source": [
"print(response.get_message())"
]
@ -128,7 +147,7 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@ -137,16 +156,16 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"I'm AI Assistant, nice to meet you!\", role='assistant'), logprobs=None)], created=1736492008, id='cmpl-d768cd358d4e48b4a59c1c7814dc5ffc', model='meta/llama3-8b-instruct', object='chat.completion', usage={'completion_tokens': 11, 'prompt_tokens': 43, 'total_tokens': 54, 'completion_tokens_details': None, 'prompt_tokens_details': None})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"I'm AI Assistant, nice to meet you!\", role='assistant'), logprobs=None)], created=1737847868, id='cmpl-abe14ae7edef456da870b7c473bffcc7', model='meta/llama3-8b-instruct', object='chat.completion', usage={'completion_tokens': 11, 'prompt_tokens': 43, 'total_tokens': 54, 'completion_tokens_details': None, 'prompt_tokens_details': None})"
]
},
"execution_count": 4,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
@ -164,11 +183,11 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"from floki.types import UserMessage\n",
"from dapr_agents.types import UserMessage\n",
"\n",
"# Initialize the client\n",
"llm = NVIDIAChatClient()\n",
@ -179,7 +198,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@ -197,7 +216,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
@ -228,7 +247,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: NVIDIA Chat Completion with Structured Output\n",
"\n",
"This notebook demonstrates how to use the `NVIDIAChatClient` from `Floki` to generate structured output using `Pydantic` models.\n",
"This notebook demonstrates how to use the `NVIDIAChatClient` from `dapr_agents` to generate structured output using `Pydantic` models.\n",
"\n",
"We will:\n",
"\n",
@ -19,9 +19,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Install Required Libraries\n",
"\n",
"Ensure floki and pydantic are installed:"
"## Install Required Libraries"
]
},
{
@ -30,7 +28,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -85,9 +83,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import Floki Libraries\n",
"\n",
"Import the necessary classes and types from Floki."
"## Import Libraries"
]
},
{
@ -96,8 +92,8 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import NVIDIAChatClient\n",
"from floki.types import UserMessage"
"from dapr_agents import NVIDIAChatClient\n",
"from dapr_agents.types import UserMessage"
]
},
{
@ -111,14 +107,14 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.nvidia.client:Initializing NVIDIA API client...\n"
"INFO:dapr_agents.llm.nvidia.client:Initializing NVIDIA API client...\n"
]
}
],
@ -139,7 +135,7 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@ -162,20 +158,20 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.utils.request:A response model has been passed to structure the response of the LLM.\n",
"INFO:floki.llm.utils.structure:Structured response enabled.\n",
"INFO:floki.llm.nvidia.chat:Invoking ChatCompletion API.\n",
"INFO:dapr_agents.llm.utils.request:A response model has been passed to structure the response of the LLM.\n",
"INFO:dapr_agents.llm.utils.structure:Structured response enabled.\n",
"INFO:dapr_agents.llm.nvidia.chat:Invoking ChatCompletion API.\n",
"INFO:httpx:HTTP Request: POST https://integrate.api.nvidia.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.nvidia.chat:Chat completion retrieved successfully.\n",
"INFO:floki.llm.utils.response:Structured output was successfully validated.\n",
"INFO:floki.llm.utils.response:Returning an instance of <class '__main__.Dog'>.\n"
"INFO:dapr_agents.llm.nvidia.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.llm.utils.response:Structured output was successfully validated.\n",
"INFO:dapr_agents.llm.utils.response:Returning an instance of <class '__main__.Dog'>.\n"
]
}
],
@ -188,16 +184,16 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"Dog(name='Laika', breed='Mongrel', reason='Space exploration')"
"Dog(name='Laika', breed='Soviet space dog (mixed breeds)', reason='First animal in space')"
]
},
"execution_count": 11,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
@ -230,7 +226,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: NVIDIA Embeddings Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `NVIDIAEmbedder` in `Floki` for generating text embeddings. We will explore:\n",
"This notebook demonstrates how to use the `NVIDIAEmbedder` in `dapr-agents` for generating text embeddings. We will explore:\n",
"\n",
"* Initializing the `NVIDIAEmbedder`.\n",
"* Generating embeddings for single and multiple inputs.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -41,7 +41,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 1,
"metadata": {},
"outputs": [
{
@ -50,7 +50,7 @@
"True"
]
},
"execution_count": 2,
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
@ -69,11 +69,11 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from floki.document.embedder import NVIDIAEmbedder"
"from dapr_agents.document.embedder import NVIDIAEmbedder"
]
},
{
@ -87,7 +87,7 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
@ -108,7 +108,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 4,
"metadata": {},
"outputs": [
{
@ -141,7 +141,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 5,
"metadata": {},
"outputs": [
{
@ -179,7 +179,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 6,
"metadata": {},
"outputs": [
{
@ -207,7 +207,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 7,
"metadata": {},
"outputs": [
{
@ -252,7 +252,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: OpenAI Audio Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the OpenAIAudioClient in Floki for basic tasks with the OpenAI Audio API. We will explore:\n",
"This notebook demonstrates how to use the `OpenAIAudioClient` in `dapr-agents` for basic tasks with the OpenAI Audio API. We will explore:\n",
"\n",
"* Generating speech from text and saving it as an MP3 file.\n",
"* Transcribing audio to text.\n",
@ -28,7 +28,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -72,7 +72,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import OpenAIAudioClient\n",
"from dapr_agents import OpenAIAudioClient\n",
"\n",
"client = OpenAIAudioClient()"
]
@ -102,7 +102,7 @@
}
],
"source": [
"from floki.types.llm import AudioSpeechRequest\n",
"from dapr_agents.types.llm import AudioSpeechRequest\n",
"\n",
"# Define the text to convert to speech\n",
"text_to_speech = \"Hello Roberto! This is an example of text-to-speech generation.\"\n",
@ -137,11 +137,11 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from floki.types.llm import AudioSpeechRequest\n",
"from dapr_agents.types.llm import AudioSpeechRequest\n",
"\n",
"# Define the text to convert to speech\n",
"text_to_speech = \"Hola Roberto! Este es otro ejemplo de generacion de voz desde texto.\"\n",
@ -176,7 +176,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 5,
"metadata": {},
"outputs": [
{
@ -188,7 +188,7 @@
}
],
"source": [
"from floki.types.llm import AudioTranscriptionRequest\n",
"from dapr_agents.types.llm import AudioTranscriptionRequest\n",
"\n",
"# Specify the audio file to transcribe\n",
"audio_file_path = \"output_speech.mp3\"\n",
@ -215,7 +215,7 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 6,
"metadata": {},
"outputs": [
{
@ -256,14 +256,14 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Transcription: Hola Roberto, este es otro ejemplo de generación de voz desde texto.\n"
"Transcription: ¡Hola, Roberto! Este es otro ejemplo de generación de voz desde texto.\n"
]
}
],
@ -307,19 +307,19 @@
},
{
"cell_type": "code",
"execution_count": 16,
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Translation: Hello, Roberto. This is another example of voice generation from text.\n"
"Translation: Hola Roberto, este es otro ejemplo de generación de voz desde texto.\n"
]
}
],
"source": [
"from floki.types.llm import AudioTranslationRequest\n",
"from dapr_agents.types.llm import AudioTranslationRequest\n",
"\n",
"# Specify the audio file to translate\n",
"audio_file_path = \"output_speech_spanish_auto.mp3\"\n",
@ -347,14 +347,14 @@
},
{
"cell_type": "code",
"execution_count": 19,
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Translation: Hello, Roberto. This is another example of voice generation from text.\n"
"Translation: Hola Roberto, este es otro ejemplo de generación de voz desde texto.\n"
]
}
],
@ -388,14 +388,14 @@
},
{
"cell_type": "code",
"execution_count": 20,
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Translation: Hello, Roberto. This is another example of voice generation from text.\n"
"Translation: Hola Roberto, este es otro ejemplo de generación de voz desde texto.\n"
]
}
],
@ -445,7 +445,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: OpenAI Chat Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `OpenAIChatClient` in `Floki` for basic tasks with the OpenAI Chat API. We will explore:\n",
"This notebook demonstrates how to use the `OpenAIChatClient` in `dapr-agents` for basic tasks with the OpenAI Chat API. We will explore:\n",
"\n",
"* Initializing the OpenAI Chat client.\n",
"* Generating responses to simple prompts.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -41,7 +41,7 @@
},
{
"cell_type": "code",
"execution_count": 17,
"execution_count": 1,
"metadata": {},
"outputs": [
{
@ -50,7 +50,7 @@
"True"
]
},
"execution_count": 17,
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
@ -69,11 +69,11 @@
},
{
"cell_type": "code",
"execution_count": 18,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from floki import OpenAIChatClient"
"from dapr_agents import OpenAIChatClient"
]
},
{
@ -87,7 +87,7 @@
},
{
"cell_type": "code",
"execution_count": 19,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
@ -97,16 +97,16 @@
},
{
"cell_type": "code",
"execution_count": 20,
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One famous dog is Lassie, a fictional Rough Collie depicted in books, television shows, and films. Lassie is known for her intelligence and loyalty. Another famous dog is Snoopy from the \"Peanuts\" comic strip, known for his imaginative adventures and unique personality.', role='assistant'), logprobs=None)], created=1736626962, id='chatcmpl-AocIEFoz3HxA4E5Ax2vxzfbjVLh1E', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 57, 'prompt_tokens': 12, 'total_tokens': 69, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content='One famous dog is Lassie, a Rough Collie known from the classic television series and movies starting in the 1940s. Lassie became an iconic character and is known for her intelligence and bravery.', role='assistant'), logprobs=None)], created=1737848120, id='chatcmpl-AtjyKAZfSv2IXfiXSRmGpxmIabQRw', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 43, 'prompt_tokens': 12, 'total_tokens': 55, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
]
},
"execution_count": 20,
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
@ -121,14 +121,14 @@
},
{
"cell_type": "code",
"execution_count": 21,
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'content': 'One famous dog is Lassie, a fictional Rough Collie depicted in books, television shows, and films. Lassie is known for her intelligence and loyalty. Another famous dog is Snoopy from the \"Peanuts\" comic strip, known for his imaginative adventures and unique personality.', 'role': 'assistant'}\n"
"{'content': 'One famous dog is Lassie, a Rough Collie known from the classic television series and movies starting in the 1940s. Lassie became an iconic character and is known for her intelligence and bravery.', 'role': 'assistant'}\n"
]
}
],
@ -147,7 +147,7 @@
},
{
"cell_type": "code",
"execution_count": 22,
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@ -156,7 +156,7 @@
},
{
"cell_type": "code",
"execution_count": 23,
"execution_count": 7,
"metadata": {},
"outputs": [
{
@ -165,7 +165,7 @@
"ChatPromptTemplate(input_variables=['chat_history', 'question'], pre_filled_variables={}, messages=[SystemMessage(content='You are an AI assistant who helps people find information.\\nAs the assistant, you answer questions briefly, succinctly, \\nand in a personable manner using markdown and even add some personal flair with appropriate emojis.\\n\\n{% for item in chat_history %}\\n{{item.role}}:\\n{{item.content}}\\n{% endfor %}', role='system'), UserMessage(content='{{question}}', role='user')], template_format='jinja2')"
]
},
"execution_count": 23,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
@ -176,16 +176,16 @@
},
{
"cell_type": "code",
"execution_count": 24,
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"Hey there! I'm your friendly AI assistant. You can call me whatever you like, but I don't have a specific name. 😊 How can I help you today?\", role='assistant'), logprobs=None)], created=1736626968, id='chatcmpl-AocIKt88klLjv6x2Kb0vLBHEbuOc2', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 34, 'prompt_tokens': 57, 'total_tokens': 91, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
"ChatCompletion(choices=[Choice(finish_reason='stop', index=0, message=MessageContent(content=\"Hey there! I'm your friendly AI assistant. You can call me whatever you like, but I don't have a specific name. 😊 How can I help you today?\", role='assistant'), logprobs=None)], created=1737848124, id='chatcmpl-AtjyOHSaZYr2uy3qsP6SB2q4WCXz7', model='gpt-4o-2024-08-06', object='chat.completion', usage={'completion_tokens': 34, 'prompt_tokens': 57, 'total_tokens': 91, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}})"
]
},
"execution_count": 24,
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
@ -203,11 +203,11 @@
},
{
"cell_type": "code",
"execution_count": 25,
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"from floki.types import UserMessage\n",
"from dapr_agents.types import UserMessage\n",
"\n",
"# Initialize the client\n",
"llm = OpenAIChatClient()\n",
@ -218,7 +218,7 @@
},
{
"cell_type": "code",
"execution_count": 26,
"execution_count": 11,
"metadata": {},
"outputs": [
{
@ -236,7 +236,7 @@
},
{
"cell_type": "code",
"execution_count": 27,
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
@ -267,7 +267,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: OpenAI Chat Completion with Structured Output\n",
"\n",
"This notebook demonstrates how to use the `OpenAIChatClient` from `Floki` to generate structured output using `Pydantic` models.\n",
"This notebook demonstrates how to use the `OpenAIChatClient` from `dapr-agents` to generate structured output using `Pydantic` models.\n",
"\n",
"We will:\n",
"\n",
@ -20,8 +20,7 @@
"metadata": {},
"source": [
"## Install Required Libraries\n",
"\n",
"Ensure floki and pydantic are installed:"
"Before starting, ensure the required libraries are installed:"
]
},
{
@ -30,7 +29,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv"
]
},
{
@ -85,9 +84,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import Floki Libraries\n",
"## Import dapr-agents Libraries\n",
"\n",
"Import the necessary classes and types from Floki."
"Import the necessary classes and types from `dapr-agents`."
]
},
{
@ -96,8 +95,8 @@
"metadata": {},
"outputs": [],
"source": [
"from floki import OpenAIChatClient\n",
"from floki.types import UserMessage"
"from dapr_agents import OpenAIChatClient\n",
"from dapr_agents.types import UserMessage"
]
},
{
@ -118,7 +117,7 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.openai.client.base:Initializing OpenAI client...\n"
"INFO:dapr_agents.llm.openai.client.base:Initializing OpenAI client...\n"
]
}
],
@ -167,13 +166,13 @@
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:floki.llm.utils.request:A response model has been passed to structure the response of the LLM.\n",
"INFO:floki.llm.utils.structure:Structured response enabled.\n",
"INFO:floki.llm.openai.chat:Invoking ChatCompletion API.\n",
"INFO:dapr_agents.llm.utils.request:A response model has been passed to structure the response of the LLM.\n",
"INFO:dapr_agents.llm.utils.structure:Structured response enabled.\n",
"INFO:dapr_agents.llm.openai.chat:Invoking ChatCompletion API.\n",
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"INFO:floki.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:floki.llm.utils.response:Structured output was successfully validated.\n",
"INFO:floki.llm.utils.response:Returning an instance of <class '__main__.Dog'>.\n"
"INFO:dapr_agents.llm.openai.chat:Chat completion retrieved successfully.\n",
"INFO:dapr_agents.llm.utils.response:Structured output was successfully validated.\n",
"INFO:dapr_agents.llm.utils.response:Returning an instance of <class '__main__.Dog'>.\n"
]
}
],
@ -192,7 +191,7 @@
{
"data": {
"text/plain": [
"Dog(name='Balto', breed='Siberian Husky', reason='Balto was a sled dog who became famous for his role in the 1925 serum run to Nome, Alaska, where he helped deliver a life-saving diphtheria serum across treacherous terrain under severe weather conditions.')"
"Dog(name='Laika', breed='Siberian Husky mix', reason='Laika was the first dog to orbit the Earth and the first animal to fly in space, paving the way for human spaceflight.')"
]
},
"execution_count": 7,
@ -203,6 +202,13 @@
"source": [
"response"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@ -221,7 +227,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# LLM: OpenAI Embeddings Endpoint Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `OpenAIEmbedder` in `Floki` for generating text embeddings. We will explore:\n",
"This notebook demonstrates how to use the `OpenAIEmbedder` in `dapr-agents` for generating text embeddings. We will explore:\n",
"\n",
"* Initializing the `OpenAIEmbedder`.\n",
"* Generating embeddings for single and multiple inputs.\n",
@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai"
"!pip install dapr-agents python-dotenv tiktoken"
]
},
{
@ -41,7 +41,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 1,
"metadata": {},
"outputs": [
{
@ -50,7 +50,7 @@
"True"
]
},
"execution_count": 8,
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
@ -69,11 +69,11 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from floki.document.embedder import OpenAIEmbedder"
"from dapr_agents.document.embedder import OpenAIEmbedder"
]
},
{
@ -87,7 +87,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@ -110,14 +110,14 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Embedding (first 5 values): [0.001670185, 0.00602331, -0.015077059, -0.008596678, -0.011558244]\n"
"Embedding (first 5 values): [0.0015723939, 0.005963983, -0.015102495, -0.008559333, -0.011583589]\n"
]
}
],
@ -143,15 +143,15 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Text 1 embedding (first 5 values): [0.0015723939, 0.005963983, -0.015102495, -0.008559333, -0.011583589]\n",
"Text 2 embedding (first 5 values): [0.03261204, -0.020966679, 0.0026475298, -0.009384127, -0.007305047]\n"
"Text 1 embedding (first 5 values): [0.0016219332, 0.0060211923, -0.015054546, -0.008541764, -0.011522614]\n",
"Text 2 embedding (first 5 values): [0.0326601, -0.020990396, 0.0026253697, -0.009408621, -0.0072794342]\n"
]
}
],
@ -181,7 +181,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 8,
"metadata": {},
"outputs": [
{
@ -209,7 +209,7 @@
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@ -254,7 +254,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# VectorStore: Chroma and OpenAI Embeddings Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `ChromaVectorStore` in `Floki` for storing, querying, and filtering documents. We will explore:\n",
"This notebook demonstrates how to use the `ChromaVectorStore` in `dapr-agents` for storing, querying, and filtering documents. We will explore:\n",
"\n",
"* Initializing the `OpenAIEmbedder` embedding function and `ChromaVectorStore`.\n",
"* Adding documents with text and metadata.\n",
@ -31,7 +31,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai chromadb"
"!pip install dapr-agents python-dotenv chromadb"
]
},
{
@ -79,7 +79,7 @@
"metadata": {},
"outputs": [],
"source": [
"from floki.document.embedder import OpenAIEmbedder\n",
"from dapr_agents.document.embedder import OpenAIEmbedder\n",
"\n",
"embedding_funciton = OpenAIEmbedder(\n",
" model = \"text-embedding-ada-002\",\n",
@ -98,11 +98,11 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"from floki.storage import ChromaVectorStore\n",
"from dapr_agents.storage import ChromaVectorStore\n",
"\n",
"# Initialize ChromaVectorStore\n",
"store = ChromaVectorStore(\n",
@ -131,11 +131,11 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from floki.types.document import Document\n",
"from dapr_agents.types.document import Document\n",
"\n",
"# Example Lord of the Rings-inspired conversations\n",
"documents = [\n",
@ -191,7 +191,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 5,
"metadata": {},
"outputs": [
{
@ -218,7 +218,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 6,
"metadata": {},
"outputs": [
{
@ -226,16 +226,16 @@
"output_type": "stream",
"text": [
"Retrieved documents:\n",
"ID: 3193af8f-6bb8-45d7-a08c-dfd08b162b21, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'location': 'The Shire', 'topic': 'wisdom'}\n",
"ID: 164c9a47-9716-4171-bee9-ab51f2ede2dd, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'location': 'Moria', 'topic': 'destiny'}\n",
"ID: 1ad06853-6170-431b-bcc5-979d9e415762, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'location': 'Rivendell', 'topic': 'power'}\n",
"ID: 8fd5373c-e143-4e80-b5c5-6ceb5cce5d9a, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'location': 'Mount Doom', 'topic': 'friendship'}\n",
"ID: 8b226088-b002-4d7b-87ad-0484e00bad36, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'location': 'Rohan', 'topic': 'war'}\n",
"ID: 7514fad3-90ba-491f-89d1-a21cddc6f66a, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'location': \"Helm's Deep\", 'topic': 'bravery'}\n",
"ID: fbc9dd2f-f752-4fa3-8f74-fb6a9f167fae, Text: Boromir: One does not simply walk into Mordor., Metadata: {'location': 'Rivendell', 'topic': 'impossible tasks'}\n",
"ID: 81d2d903-3a7c-46e7-b55c-b93fffb01dbb, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'location': 'Lothlórien', 'topic': 'hope'}\n",
"ID: 2f3d4555-e12f-4e50-9bbe-82bc9935e953, Text: Théoden: So it begins., Metadata: {'location': \"Helm's Deep\", 'topic': 'battle'}\n",
"ID: 75cd422b-2a7a-43c3-9de4-e26724dcbd01, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'location': 'Rivendell', 'topic': 'sacrifice'}\n"
"ID: e92f2785-9044-44ca-8198-1d60797bea24, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'location': 'The Shire', 'topic': 'wisdom'}\n",
"ID: 1c465f18-77c6-4eb0-a926-6e56ed219d0b, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'location': 'Moria', 'topic': 'destiny'}\n",
"ID: 852104a9-d201-4139-a341-2ad8d016600d, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'location': 'Rivendell', 'topic': 'power'}\n",
"ID: cee8e4ce-2f57-4cd6-bbdf-2eac5d52535f, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'location': 'Mount Doom', 'topic': 'friendship'}\n",
"ID: 5e9e0252-4981-4cb9-b1d8-a11d8f080cd1, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'location': 'Rohan', 'topic': 'war'}\n",
"ID: 1f781c5e-858a-4c96-98b0-74643d99f9a3, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'location': \"Helm's Deep\", 'topic': 'bravery'}\n",
"ID: c8e1f523-9dbd-42f9-9fab-6e3bf4a47234, Text: Boromir: One does not simply walk into Mordor., Metadata: {'location': 'Rivendell', 'topic': 'impossible tasks'}\n",
"ID: ad64b341-7098-4601-8605-91d9e1756a11, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'location': 'Lothlórien', 'topic': 'hope'}\n",
"ID: c899e9c6-f2d3-4c2e-8e4c-347fbfd1df0a, Text: Théoden: So it begins., Metadata: {'location': \"Helm's Deep\", 'topic': 'battle'}\n",
"ID: cbaf0bfe-6a22-47d7-8911-1addbc146768, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'location': 'Rivendell', 'topic': 'sacrifice'}\n"
]
}
],
@ -258,14 +258,14 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Updated document: [{'id': '3193af8f-6bb8-45d7-a08c-dfd08b162b21', 'metadata': {'location': 'Fangorn Forest', 'topic': 'hope and wisdom'}, 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.'}]\n"
"Updated document: [{'id': 'e92f2785-9044-44ca-8198-1d60797bea24', 'metadata': {'location': 'Fangorn Forest', 'topic': 'hope and wisdom'}, 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.'}]\n"
]
}
],
@ -297,7 +297,7 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 8,
"metadata": {},
"outputs": [
{
@ -328,7 +328,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@ -364,7 +364,7 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
@ -381,13 +381,13 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 11,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'ids': [['3193af8f-6bb8-45d7-a08c-dfd08b162b21']],\n",
"{'ids': [['e92f2785-9044-44ca-8198-1d60797bea24']],\n",
" 'embeddings': None,\n",
" 'documents': [['Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.']],\n",
" 'uris': None,\n",
@ -399,7 +399,7 @@
" <IncludeEnum.metadatas: 'metadatas'>]}"
]
},
"execution_count": 12,
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
@ -419,7 +419,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 12,
"metadata": {},
"outputs": [
{
@ -428,7 +428,7 @@
"['example_collection']"
]
},
"execution_count": 13,
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
@ -439,7 +439,7 @@
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
@ -449,7 +449,7 @@
},
{
"cell_type": "code",
"execution_count": 15,
"execution_count": 14,
"metadata": {},
"outputs": [
{
@ -458,7 +458,7 @@
"[]"
]
},
"execution_count": 15,
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
@ -466,6 +466,13 @@
"source": [
"store.client.list_collections()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@ -484,7 +491,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# VectorStore: Chroma and Sentence Transformer (all-MiniLM-L6-v2) with Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `ChromaVectorStore` in `Floki` for storing, querying, and filtering documents. We will explore:\n",
"This notebook demonstrates how to use the `ChromaVectorStore` in `dapr-agents` for storing, querying, and filtering documents. We will explore:\n",
"\n",
"* Initializing the `SentenceTransformerEmbedder` embedding function and `ChromaVectorStore`.\n",
"* Adding documents with text and metadata.\n",
@ -31,7 +31,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai chromadb"
"!pip install dapr-agents python-dotenv chromadb sentence-transformers"
]
},
{
@ -75,11 +75,11 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from floki.document.embedder import SentenceTransformerEmbedder\n",
"from dapr_agents.document.embedder import SentenceTransformerEmbedder\n",
"\n",
"embedding_function = SentenceTransformerEmbedder(\n",
" model=\"all-MiniLM-L6-v2\"\n",
@ -97,11 +97,11 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"from floki.storage import ChromaVectorStore\n",
"from dapr_agents.storage import ChromaVectorStore\n",
"\n",
"# Initialize ChromaVectorStore\n",
"store = ChromaVectorStore(\n",
@ -130,11 +130,11 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from floki.types.document import Document\n",
"from dapr_agents.types.document import Document\n",
"\n",
"# Example Lord of the Rings-inspired conversations\n",
"documents = [\n",
@ -190,7 +190,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 5,
"metadata": {},
"outputs": [
{
@ -217,7 +217,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 6,
"metadata": {},
"outputs": [
{
@ -225,16 +225,16 @@
"output_type": "stream",
"text": [
"Retrieved documents:\n",
"ID: b6020c96-2c81-452f-b01f-a7143d6aacff, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'location': 'The Shire', 'topic': 'wisdom'}\n",
"ID: f864aba5-1c70-451c-8c9b-e681cb5dc1c2, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'location': 'Moria', 'topic': 'destiny'}\n",
"ID: b3bce064-a6f3-4b8c-9c5b-66f9664b5c4a, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'location': 'Rivendell', 'topic': 'power'}\n",
"ID: 3bd8be9e-8573-4a10-b83b-ab2e46f11045, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'location': 'Mount Doom', 'topic': 'friendship'}\n",
"ID: b51c9a0d-4698-46fa-af42-8878dd0466f8, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'location': 'Rohan', 'topic': 'war'}\n",
"ID: fe633494-08ee-4c8e-86d4-5d54331a9896, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'location': \"Helm's Deep\", 'topic': 'bravery'}\n",
"ID: 6e2676a6-79b7-4837-9c2d-c93aebeb046e, Text: Boromir: One does not simply walk into Mordor., Metadata: {'location': 'Rivendell', 'topic': 'impossible tasks'}\n",
"ID: 2b2aeda6-2629-46d3-8f5c-6bafac9b893f, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'location': 'Lothlórien', 'topic': 'hope'}\n",
"ID: 8aee61b6-9e7a-4187-bf1e-b269c27776a6, Text: Théoden: So it begins., Metadata: {'location': \"Helm's Deep\", 'topic': 'battle'}\n",
"ID: 091fa07a-2672-4d6f-adf8-6812990f440f, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'location': 'Rivendell', 'topic': 'sacrifice'}\n"
"ID: 3388fac1-25a3-48b8-a919-eaec467b2bb9, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'location': 'The Shire', 'topic': 'wisdom'}\n",
"ID: d9f18129-fac6-4a82-aaf2-aa9546cf9f4e, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'location': 'Moria', 'topic': 'destiny'}\n",
"ID: 6148a8b7-a690-4118-95a1-07a1cf8f8f43, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'location': 'Rivendell', 'topic': 'power'}\n",
"ID: d7543888-350f-497e-9064-bf7a72628205, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'location': 'Mount Doom', 'topic': 'friendship'}\n",
"ID: 4fd40aff-ab0c-4d8d-9fc5-7c81b4681b5f, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'location': 'Rohan', 'topic': 'war'}\n",
"ID: b8095473-c7b5-4b47-8d84-3b711aca3033, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'location': \"Helm's Deep\", 'topic': 'bravery'}\n",
"ID: 79722ac0-55f2-4a09-b069-e71fb0202029, Text: Boromir: One does not simply walk into Mordor., Metadata: {'location': 'Rivendell', 'topic': 'impossible tasks'}\n",
"ID: a35c54ed-17d5-4346-8cda-5e5e715c9fac, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'location': 'Lothlórien', 'topic': 'hope'}\n",
"ID: 900660ac-7146-44e5-9bd0-c87dcb8b50cd, Text: Théoden: So it begins., Metadata: {'location': \"Helm's Deep\", 'topic': 'battle'}\n",
"ID: 7ed23e82-d1d1-4730-b323-7c170b24734c, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'location': 'Rivendell', 'topic': 'sacrifice'}\n"
]
}
],
@ -257,14 +257,14 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Updated document: [{'id': 'b6020c96-2c81-452f-b01f-a7143d6aacff', 'metadata': {'location': 'Fangorn Forest', 'topic': 'hope and wisdom'}, 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.'}]\n"
"Updated document: [{'id': '3388fac1-25a3-48b8-a919-eaec467b2bb9', 'metadata': {'location': 'Fangorn Forest', 'topic': 'hope and wisdom'}, 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.'}]\n"
]
}
],
@ -296,7 +296,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 8,
"metadata": {},
"outputs": [
{
@ -327,7 +327,7 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@ -363,7 +363,7 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
@ -380,13 +380,13 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 11,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'ids': [['b6020c96-2c81-452f-b01f-a7143d6aacff']],\n",
"{'ids': [['3388fac1-25a3-48b8-a919-eaec467b2bb9']],\n",
" 'embeddings': None,\n",
" 'documents': [['Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.']],\n",
" 'uris': None,\n",
@ -398,7 +398,7 @@
" <IncludeEnum.metadatas: 'metadatas'>]}"
]
},
"execution_count": 13,
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
@ -418,7 +418,7 @@
},
{
"cell_type": "code",
"execution_count": 14,
"execution_count": 12,
"metadata": {},
"outputs": [
{
@ -427,7 +427,7 @@
"['example_collection']"
]
},
"execution_count": 14,
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
@ -438,7 +438,7 @@
},
{
"cell_type": "code",
"execution_count": 15,
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
@ -448,7 +448,7 @@
},
{
"cell_type": "code",
"execution_count": 16,
"execution_count": 14,
"metadata": {},
"outputs": [
{
@ -457,7 +457,7 @@
"[]"
]
},
"execution_count": 16,
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
@ -490,7 +490,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -6,7 +6,7 @@
"source": [
"# VectorStore: Postgres and Sentence Transformer (all-MiniLM-L6-v2) with Basic Examples\n",
"\n",
"This notebook demonstrates how to use the `PostgresVectorStore` in Floki for storing, querying, and filtering documents. We will explore:\n",
"This notebook demonstrates how to use the `PostgresVectorStore` in `dapr-agents` for storing, querying, and filtering documents. We will explore:\n",
"\n",
"* Initializing the `SentenceTransformerEmbedder` embedding function and `PostgresVectorStore`.\n",
"* Adding documents with text and metadata.\n",
@ -29,8 +29,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install floki-ai\n",
"!pip install \"psycopg[binary,pool]\" pgvector"
"!pip install dapr-agents python-dotenv \"psycopg[binary,pool]\" pgvector"
]
},
{
@ -44,7 +43,7 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 2,
"metadata": {},
"outputs": [
{
@ -53,8 +52,9 @@
"True"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "display_data"
"output_type": "execute_result"
}
],
"source": [
@ -96,11 +96,11 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"from floki.document.embedder import SentenceTransformerEmbedder\n",
"from dapr_agents.document.embedder import SentenceTransformerEmbedder\n",
"\n",
"embedding_function = SentenceTransformerEmbedder(\n",
" model=\"all-MiniLM-L6-v2\"\n",
@ -118,11 +118,11 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from floki.storage.vectorstores import PostgresVectorStore\n",
"from dapr_agents.storage.vectorstores import PostgresVectorStore\n",
"import os\n",
"\n",
"# Set up connection parameters\n",
@ -153,11 +153,11 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"from floki.types.document import Document\n",
"from dapr_agents.types.document import Document\n",
"\n",
"# Example Lord of the Rings-inspired conversations\n",
"documents = [\n",
@ -213,7 +213,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 6,
"metadata": {},
"outputs": [
{
@ -240,7 +240,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 7,
"metadata": {},
"outputs": [
{
@ -248,16 +248,16 @@
"output_type": "stream",
"text": [
"Retrieved documents:\n",
"ID: 67154404-837e-4310-b641-519e1e7ef7ae, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'topic': 'wisdom', 'location': 'The Shire'}\n",
"ID: d1f642a6-78b3-4116-82a5-a89b918ca5dc, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'topic': 'destiny', 'location': 'Moria'}\n",
"ID: 6ba01fc5-7c98-4e19-b38e-dbd6ec8b96b4, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'topic': 'power', 'location': 'Rivendell'}\n",
"ID: 1d8addb9-8676-4e2f-8816-9a974157ba23, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'topic': 'friendship', 'location': 'Mount Doom'}\n",
"ID: 48bacce1-65c4-414c-8710-38425b668ee3, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'topic': 'war', 'location': 'Rohan'}\n",
"ID: d3f06a60-aafe-46df-aa6b-9271f947c8d9, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'topic': 'bravery', 'location': \"Helm's Deep\"}\n",
"ID: d1470289-415d-4425-9d26-af0fe6298baf, Text: Boromir: One does not simply walk into Mordor., Metadata: {'topic': 'impossible tasks', 'location': 'Rivendell'}\n",
"ID: 42290060-27fa-42c1-a251-dc0baf3e352b, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'topic': 'hope', 'location': 'Lothlórien'}\n",
"ID: cc8103f1-6f4f-4c93-847f-065baedbc1ee, Text: Théoden: So it begins., Metadata: {'topic': 'battle', 'location': \"Helm's Deep\"}\n",
"ID: 9eb0d8b9-ac41-4b27-b6c0-2ce8ba15e0a0, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'topic': 'sacrifice', 'location': 'Rivendell'}\n"
"ID: 28d1981b-04f7-49fa-898f-8c0da1ede9da, Text: Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to., Metadata: {'topic': 'wisdom', 'location': 'The Shire'}\n",
"ID: e2a709b1-a8ee-4cca-afbd-c86b511383a8, Text: Frodo: I wish the Ring had never come to me. I wish none of this had happened., Metadata: {'topic': 'destiny', 'location': 'Moria'}\n",
"ID: 0cc7be71-31c0-4f2c-a859-d607ffff4bd8, Text: Aragorn: You cannot wield it! None of us can. The One Ring answers to Sauron alone. It has no other master., Metadata: {'topic': 'power', 'location': 'Rivendell'}\n",
"ID: 764240a8-e95a-484e-aee9-3df5f7389ddc, Text: Sam: I can't carry it for you, but I can carry you!, Metadata: {'topic': 'friendship', 'location': 'Mount Doom'}\n",
"ID: c27142af-750f-45c1-b3c3-cf8633ecf2c9, Text: Legolas: A red sun rises. Blood has been spilled this night., Metadata: {'topic': 'war', 'location': 'Rohan'}\n",
"ID: 0e823415-e948-4c20-83d4-883c40cc3230, Text: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'topic': 'bravery', 'location': \"Helm's Deep\"}\n",
"ID: 41e6658b-133b-41b2-bb1b-90b07c2d1f1f, Text: Boromir: One does not simply walk into Mordor., Metadata: {'topic': 'impossible tasks', 'location': 'Rivendell'}\n",
"ID: 224fed8e-64b3-452c-b56c-cf7f0aebc7e1, Text: Galadriel: Even the smallest person can change the course of the future., Metadata: {'topic': 'hope', 'location': 'Lothlórien'}\n",
"ID: c0f67470-dde0-4944-96f4-5bc1397376be, Text: Théoden: So it begins., Metadata: {'topic': 'battle', 'location': \"Helm's Deep\"}\n",
"ID: 953a84fe-e354-48b2-82b0-51b2c9b4a4e4, Text: Elrond: The strength of the Ring-bearer is failing. In his heart, Frodo begins to understand. The quest will claim his life., Metadata: {'topic': 'sacrifice', 'location': 'Rivendell'}\n"
]
}
],
@ -271,14 +271,14 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Specific document: [{'id': UUID('67154404-837e-4310-b641-519e1e7ef7ae'), 'document': 'Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to.', 'metadata': {'topic': 'wisdom', 'location': 'The Shire'}}]\n"
"Specific document: [{'id': UUID('28d1981b-04f7-49fa-898f-8c0da1ede9da'), 'document': 'Gandalf: A wizard is never late, Frodo Baggins. Nor is he early; he arrives precisely when he means to.', 'metadata': {'topic': 'wisdom', 'location': 'The Shire'}}]\n"
]
}
],
@ -291,7 +291,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@ -321,14 +321,14 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Updated document: [{'id': UUID('67154404-837e-4310-b641-519e1e7ef7ae'), 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.', 'metadata': {'topic': 'hope and wisdom', 'location': 'Fangorn Forest'}}]\n"
"Updated document: [{'id': UUID('28d1981b-04f7-49fa-898f-8c0da1ede9da'), 'document': 'Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true.', 'metadata': {'topic': 'hope and wisdom', 'location': 'Fangorn Forest'}}]\n"
]
}
],
@ -360,7 +360,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 11,
"metadata": {},
"outputs": [
{
@ -391,7 +391,7 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 12,
"metadata": {},
"outputs": [
{
@ -399,8 +399,8 @@
"output_type": "stream",
"text": [
"Similarity search results:\n",
"ID: d3f06a60-aafe-46df-aa6b-9271f947c8d9, Document: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'topic': 'bravery', 'location': \"Helm's Deep\"}, Similarity: 0.1567628941818613\n",
"ID: d1470289-415d-4425-9d26-af0fe6298baf, Document: Boromir: One does not simply walk into Mordor., Metadata: {'topic': 'impossible tasks', 'location': 'Rivendell'}, Similarity: 0.13233356090384096\n"
"ID: 0e823415-e948-4c20-83d4-883c40cc3230, Document: Gimli: Certainty of death. Small chance of success. What are we waiting for?, Metadata: {'topic': 'bravery', 'location': \"Helm's Deep\"}, Similarity: 0.1567628941818613\n",
"ID: 41e6658b-133b-41b2-bb1b-90b07c2d1f1f, Document: Boromir: One does not simply walk into Mordor., Metadata: {'topic': 'impossible tasks', 'location': 'Rivendell'}, Similarity: 0.13233356090384096\n"
]
}
],
@ -426,7 +426,7 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 13,
"metadata": {},
"outputs": [
{
@ -434,7 +434,7 @@
"output_type": "stream",
"text": [
"Filtered search results:\n",
"ID: 67154404-837e-4310-b641-519e1e7ef7ae, Document: Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true., Metadata: {'topic': 'hope and wisdom', 'location': 'Fangorn Forest'}, Similarity: 0.1670202911216282\n"
"ID: 28d1981b-04f7-49fa-898f-8c0da1ede9da, Document: Gandalf: Even the wisest cannot foresee all ends, but hope remains while the Company is true., Metadata: {'topic': 'hope and wisdom', 'location': 'Fangorn Forest'}, Similarity: 0.1670202911216282\n"
]
}
],
@ -465,7 +465,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 14,
"metadata": {},
"outputs": [
{
@ -481,6 +481,13 @@
"store.reset()\n",
"print(\"Database reset complete. Current documents:\", store.get())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@ -499,7 +506,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.8"
"version": "3.12.1"
}
},
"nbformat": 4,

View File

@ -1,5 +1,5 @@
from floki import WorkflowApp
from floki.types import DaprWorkflowContext
from dapr_agents import WorkflowApp
from dapr_agents.types import DaprWorkflowContext
wfapp = WorkflowApp()

View File

@ -1,5 +1,5 @@
from floki import WorkflowApp
from floki.types import DaprWorkflowContext
from dapr_agents import WorkflowApp
from dapr_agents.types import DaprWorkflowContext
from dotenv import load_dotenv
# Load environment variables

View File

@ -1,5 +1,5 @@
from floki import WorkflowApp
from floki.types import DaprWorkflowContext
from dapr_agents import WorkflowApp
from dapr_agents.types import DaprWorkflowContext
from pydantic import BaseModel
from dotenv import load_dotenv
import logging

View File

@ -1,3 +1,3 @@
floki-ai
dapr_agents
pydub
pypdf

View File

@ -1,13 +1,13 @@
from floki.document.reader.pdf.pypdf import PyPDFReader
from floki.types import DaprWorkflowContext
from floki import WorkflowApp
from dapr_agents.document.reader.pdf.pypdf import PyPDFReader
from dapr_agents.types import DaprWorkflowContext
from dapr_agents import WorkflowApp
from urllib.parse import urlparse, unquote
from dotenv import load_dotenv
from typing import Dict, Any, List
from pydantic import BaseModel
from pathlib import Path
from floki import OpenAIAudioClient
from floki.types.llm import AudioSpeechRequest
from dapr_agents import OpenAIAudioClient
from dapr_agents.types.llm import AudioSpeechRequest
from pydub import AudioSegment
import io
import requests

View File

@ -1,12 +1,12 @@
# Multi-Agent LOTR: Event-Driven Workflow
This guide shows you how to set up and run an event-driven agentic workflow using Floki. By leveraging [Dapr Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/pubsub-overview/) and FastAPI, `Floki` enables agents to collaborate dynamically in decentralized systems.
This guide shows you how to set up and run an event-driven agentic workflow using `dapr-agents`. By leveraging [Dapr Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/pubsub-overview/) and FastAPI, `dapr-agents` enables agents to collaborate dynamically in decentralized systems.
## Prerequisites
Before you start, ensure you have the following:
* [Floki environment set up](https://cyb3rward0g.github.io/floki/home/installation/), including Python 3.8 or higher and Dapr CLI.
* [Dapr Agents environment set up](https://cyb3rward0g.github.io/floki/home/installation/), including Python 3.8 or higher and Dapr CLI.
* Docker installed and running.
* Basic understanding of microservices and event-driven architecture.
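
Each service referenced in this guide wraps an `Agent` in an `AgentService` and runs it as its own FastAPI app behind a Dapr sidecar, with Pub/Sub carrying messages between them. A minimal sketch of one such service; every constructor argument and the `start()` call are illustrative assumptions, since the real service files appear separately in this commit:

```python
from dapr_agents import Agent, AgentService
from dotenv import load_dotenv
import asyncio
import logging


async def main():
    # Argument names (role, name, instructions, agent, service_port) and
    # `service.start()` are assumptions for illustration only.
    frodo = Agent(
        role="Hobbit",
        name="Frodo",
        instructions=["Speak like Frodo and keep the Ring safe."],
    )
    service = AgentService(agent=frodo, service_port=8001)
    await service.start()


if __name__ == "__main__":
    load_dotenv()
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
```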

View File

@ -1,4 +1,4 @@
from floki import Agent, AgentService
from dapr_agents import Agent, AgentService
from dotenv import load_dotenv
import asyncio
import logging

View File

@ -1,4 +1,4 @@
from floki import Agent, AgentService
from dapr_agents import Agent, AgentService
from dotenv import load_dotenv
import asyncio
import logging

View File

@ -1,4 +1,4 @@
from floki import Agent, AgentService
from dapr_agents import Agent, AgentService
from dotenv import load_dotenv
import asyncio
import logging

View File

@ -1,4 +1,4 @@
from floki import LLMWorkflowService
from dapr_agents import LLMWorkflowService
from dotenv import load_dotenv
import asyncio
import logging

View File

@ -1,4 +1,4 @@
from floki import RandomWorkflowService
from dapr_agents import RandomWorkflowService
from dotenv import load_dotenv
import asyncio
import logging

View File

@ -1,4 +1,4 @@
from floki import RoundRobinWorkflowService
from dapr_agents import RoundRobinWorkflowService
from dotenv import load_dotenv
import asyncio
import logging

11
dapr_agents/__init__.py Normal file
View File

@ -0,0 +1,11 @@
from dapr_agents.agent import (
Agent, AgentService,
AgenticWorkflowService, RoundRobinWorkflowService, RandomWorkflowService,
LLMWorkflowService, ReActAgent, ToolCallAgent, OpenAPIReActAgent
)
from dapr_agents.llm.openai import OpenAIChatClient, OpenAIAudioClient, OpenAIEmbeddingClient
from dapr_agents.llm.huggingface import HFHubChatClient
from dapr_agents.llm.nvidia import NVIDIAChatClient, NVIDIAEmbeddingClient
from dapr_agents.llm.elevenlabs import ElevenLabsSpeechClient
from dapr_agents.tool import AgentTool, tool
from dapr_agents.workflow import WorkflowApp
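
With this `__init__.py`, the main building blocks are re-exported at the package root, which is why the migrated examples import them directly from `dapr_agents` instead of deep module paths:

```python
# All of these names are re-exported by dapr_agents/__init__.py shown above.
from dapr_agents import (
    Agent,
    AgentService,
    OpenAIChatClient,
    WorkflowApp,
    tool,
)
```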

View File

@ -1,6 +1,6 @@
from floki.types.agent import AgentActorState, AgentActorMessage, AgentStatus, AgentTaskEntry, AgentTaskStatus
from floki.agent.actor.interface import AgentActorInterface
from floki.agent.base import AgentBase
from dapr_agents.types.agent import AgentActorState, AgentActorMessage, AgentStatus, AgentTaskEntry, AgentTaskStatus
from dapr_agents.agent.actor.interface import AgentActorInterface
from dapr_agents.agent.base import AgentBase
from dapr.actor.runtime.context import ActorRuntimeContext
from dapr.actor import Actor
from dapr.actor.id import ActorId

View File

@ -1,6 +1,6 @@
from abc import abstractmethod
from dapr.actor import ActorInterface, actormethod
from floki.types.agent import AgentActorMessage, AgentStatus
from dapr_agents.types.agent import AgentActorMessage, AgentStatus
from typing import Union, List, Optional
class AgentActorInterface(ActorInterface):

View File

@ -1,11 +1,11 @@
from floki.memory import MemoryBase, ConversationListMemory, ConversationVectorMemory
from floki.agent.utils.text_printer import ColorTextFormatter
from floki.types import MessageContent, MessagePlaceHolder
from floki.tool.executor import AgentToolExecutor
from floki.prompt.base import PromptTemplateBase
from floki.llm import LLMClientBase, OpenAIChatClient
from floki.prompt import ChatPromptTemplate
from floki.tool.base import AgentTool
from dapr_agents.memory import MemoryBase, ConversationListMemory, ConversationVectorMemory
from dapr_agents.agent.utils.text_printer import ColorTextFormatter
from dapr_agents.types import MessageContent, MessagePlaceHolder
from dapr_agents.tool.executor import AgentToolExecutor
from dapr_agents.prompt.base import PromptTemplateBase
from dapr_agents.llm import LLMClientBase, OpenAIChatClient
from dapr_agents.prompt import ChatPromptTemplate
from dapr_agents.tool.base import AgentTool
from typing import List, Optional, Dict, Any, Union, Callable, Literal
from pydantic import BaseModel, Field, PrivateAttr, model_validator, ConfigDict
from abc import ABC, abstractmethod

View File

@ -1,7 +1,7 @@
from floki.tool.utils.openapi import OpenAPISpecParser, openapi_spec_to_openai_fn
from floki.agent.patterns.react import ReActAgent
from floki.storage import VectorStoreBase
from floki.tool.storage import VectorToolStore
from dapr_agents.tool.utils.openapi import OpenAPISpecParser, openapi_spec_to_openai_fn
from dapr_agents.agent.patterns.react import ReActAgent
from dapr_agents.storage import VectorStoreBase
from dapr_agents.tool.storage import VectorToolStore
from typing import Dict, Optional, List, Any
from pydantic import Field, ConfigDict
import logging

View File

@ -1,7 +1,7 @@
from floki.tool.utils.openapi import OpenAPISpecParser
from floki.tool.storage import VectorToolStore
from floki.tool.base import tool
from dapr_agents.tool.utils.openapi import OpenAPISpecParser
from dapr_agents.tool.storage import VectorToolStore
from dapr_agents.tool.base import tool
from pydantic import BaseModel, Field, ConfigDict
from typing import Optional, Any, Dict
from urllib.parse import urlparse

View File

@ -1,6 +1,6 @@
from floki.types import AgentError, AssistantMessage, ChatCompletion, FunctionCall
from floki.agent import AgentBase
from floki.tool import AgentTool
from dapr_agents.types import AgentError, AssistantMessage, ChatCompletion, FunctionCall
from dapr_agents.agent import AgentBase
from dapr_agents.tool import AgentTool
from typing import List, Dict, Any, Union, Callable, Literal, Optional, Tuple
from datetime import datetime
from pydantic import Field, ConfigDict

View File

@ -1,5 +1,5 @@
from floki.types import AgentError, AssistantMessage, ChatCompletion, ToolMessage
from floki.agent import AgentBase
from dapr_agents.types import AgentError, AssistantMessage, ChatCompletion, ToolMessage
from dapr_agents.agent import AgentBase
from typing import List, Optional, Dict, Any, Union
from pydantic import Field, ConfigDict
import logging

View File

@ -5,12 +5,12 @@ from dapr.actor import ActorProxy, ActorId
from fastapi import FastAPI, Depends, HTTPException, Request, Response, status
from fastapi.encoders import jsonable_encoder
from fastapi.responses import JSONResponse
from floki.agent.services.messaging import parse_cloudevent
from floki.storage.daprstores.statestore import DaprStateStore
from floki.agent.actor import AgentActorBase, AgentActorInterface
from floki.service.fastapi import DaprEnabledService
from floki.types.agent import AgentActorMessage
from floki.agent import AgentBase
from dapr_agents.agent.services.messaging import parse_cloudevent
from dapr_agents.storage.daprstores.statestore import DaprStateStore
from dapr_agents.agent.actor import AgentActorBase, AgentActorInterface
from dapr_agents.service.fastapi import DaprEnabledService
from dapr_agents.types.agent import AgentActorMessage
from dapr_agents.agent import AgentBase
from pydantic import BaseModel, Field, model_validator, ConfigDict
from typing import Optional, Any, Union
from contextlib import asynccontextmanager

View File

@ -1,7 +1,7 @@
from floki.agent.services.base import AgentServiceBase
from floki.agent.services.messaging import message_router
from floki.types.agent import AgentActorMessage
from floki.types.message import BaseMessage, EventMessageMetadata
from dapr_agents.agent.services.base import AgentServiceBase
from dapr_agents.agent.services.messaging import message_router
from dapr_agents.types.agent import AgentActorMessage
from dapr_agents.types.message import BaseMessage, EventMessageMetadata
from fastapi import Response, status
from pydantic import BaseModel
from typing import Optional

View File

@ -1,5 +1,5 @@
from typing import Optional, Any, Callable, get_type_hints, Tuple, Type, Dict
from floki.types.message import EventMessageMetadata
from dapr_agents.types.message import EventMessageMetadata
from pydantic import BaseModel, ValidationError
from inspect import signature, Parameter
from cloudevents.http.event import CloudEvent

View File

@ -1,11 +1,11 @@
from floki.agent.patterns import ReActAgent, ToolCallAgent, OpenAPIReActAgent
from floki.tool.utils.openapi import OpenAPISpecParser
from floki.memory import ConversationListMemory
from floki.llm import OpenAIChatClient
from floki.agent.base import AgentBase
from floki.llm import LLMClientBase
from floki.memory import MemoryBase
from floki.tool import AgentTool
from dapr_agents.agent.patterns import ReActAgent, ToolCallAgent, OpenAPIReActAgent
from dapr_agents.tool.utils.openapi import OpenAPISpecParser
from dapr_agents.memory import ConversationListMemory
from dapr_agents.llm import OpenAIChatClient
from dapr_agents.agent.base import AgentBase
from dapr_agents.llm import LLMClientBase
from dapr_agents.memory import MemoryBase
from dapr_agents.tool import AgentTool
from typing import Optional, List, Union, Type, TypeVar
T = TypeVar('T', ToolCallAgent, ReActAgent, OpenAPIReActAgent)

View File

@ -1,4 +1,4 @@
from floki.types import BaseMessage
from dapr_agents.types import BaseMessage
from typing import List
from pydantic import ValidationError

View File

@ -1,14 +1,14 @@
from floki.types.message import BaseMessage
from dapr_agents.types.message import BaseMessage
from typing import Optional, Any, Union, Dict
from colorama import Style
# Define your custom colors as a dictionary
COLORS = {
"floki_teal": '\033[38;2;147;191;183m',
"floki_mustard": '\033[38;2;242;182;128m',
"floki_red": '\033[38;2;217;95;118m',
"floki_pink": '\033[38;2;191;69;126m',
"floki_purple": '\033[38;2;146;94;130m',
"dapr_agents_teal": '\033[38;2;147;191;183m',
"dapr_agents_mustard": '\033[38;2;242;182;128m',
"dapr_agents_red": '\033[38;2;217;95;118m',
"dapr_agents_pink": '\033[38;2;191;69;126m',
"dapr_agents_purple": '\033[38;2;146;94;130m',
"reset": Style.RESET_ALL
}
@ -83,10 +83,10 @@ class ColorTextFormatter:
content = message.get("content", "")
color_map = {
"user": "floki_mustard",
"assistant": "floki_teal",
"tool_calls": "floki_red",
"tool": "floki_pink"
"user": "dapr_agents_mustard",
"assistant": "dapr_agents_teal",
"tool_calls": "dapr_agents_red",
"tool": "dapr_agents_pink"
}
# Handle tool calls
@ -135,9 +135,9 @@ class ColorTextFormatter:
content (str): The content to print.
"""
color_map = {
"Thought": "floki_red",
"Action": "floki_pink",
"Observation": "floki_purple"
"Thought": "dapr_agents_red",
"Action": "dapr_agents_pink",
"Observation": "dapr_agents_purple"
}
# Get the color for the part type, defaulting to reset if not found

View File

@ -1,6 +1,6 @@
from floki.storage.daprstores.statestore import DaprStateStore
from floki.agent.utils.text_printer import ColorTextFormatter
from floki.workflow.service import WorkflowAppService
from dapr_agents.storage.daprstores.statestore import DaprStateStore
from dapr_agents.agent.utils.text_printer import ColorTextFormatter
from dapr_agents.workflow.service import WorkflowAppService
from fastapi import HTTPException
from pydantic import BaseModel, Field
from typing import Any, Optional, Union
@ -118,11 +118,11 @@ class AgenticWorkflowService(WorkflowAppService):
# Print sender -> recipient and the message
interaction_text = [
(sender_agent_name, "floki_mustard"),
(" -> ", "floki_teal"),
(f"{recipient_agent_name}\n\n", "floki_mustard"),
(sender_agent_name, "dapr_agents_mustard"),
(" -> ", "dapr_agents_teal"),
(f"{recipient_agent_name}\n\n", "dapr_agents_mustard"),
(message + "\n\n", None),
(separator + "\n", "floki_teal"),
(separator + "\n", "dapr_agents_teal"),
]
# Print the formatted text

View File

@ -1,7 +1,7 @@
from floki.agent.workflows.base import AgenticWorkflowService
from floki.types import DaprWorkflowContext, BaseMessage
from floki.llm import LLMClientBase, OpenAIChatClient
from floki.prompt import ChatPromptTemplate
from dapr_agents.agent.workflows.base import AgenticWorkflowService
from dapr_agents.types import DaprWorkflowContext, BaseMessage
from dapr_agents.llm import LLMClientBase, OpenAIChatClient
from dapr_agents.prompt import ChatPromptTemplate
from typing import Dict, Any, Optional
from datetime import timedelta
from pydantic import BaseModel, Field

View File

@ -1,5 +1,5 @@
from floki.agent.workflows.base import AgenticWorkflowService
from floki.types import DaprWorkflowContext, BaseMessage
from dapr_agents.agent.workflows.base import AgenticWorkflowService
from dapr_agents.types import DaprWorkflowContext, BaseMessage
from typing import Dict, Any, Optional
from datetime import timedelta
from pydantic import BaseModel

View File

@ -1,5 +1,5 @@
from floki.agent.workflows.base import AgenticWorkflowService
from floki.types import DaprWorkflowContext, BaseMessage
from dapr_agents.agent.workflows.base import AgenticWorkflowService
from dapr_agents.types import DaprWorkflowContext, BaseMessage
from typing import Dict, Any, Optional
from datetime import timedelta
from pydantic import BaseModel

View File

@ -1,5 +1,5 @@
from floki.llm.nvidia.embeddings import NVIDIAEmbeddingClient
from floki.document.embedder.base import EmbedderBase
from dapr_agents.llm.nvidia.embeddings import NVIDIAEmbeddingClient
from dapr_agents.document.embedder.base import EmbedderBase
from typing import List, Union
from pydantic import Field
import numpy as np

View File

@ -1,5 +1,5 @@
from floki.document.embedder.base import EmbedderBase
from floki.llm.openai.embeddings import OpenAIEmbeddingClient
from dapr_agents.document.embedder.base import EmbedderBase
from dapr_agents.llm.openai.embeddings import OpenAIEmbeddingClient
from typing import List, Any, Union, Optional
from pydantic import Field, ConfigDict
import numpy as np

View File

@ -1,4 +1,4 @@
from floki.document.embedder.base import EmbedderBase
from dapr_agents.document.embedder.base import EmbedderBase
from typing import List, Any, Optional, Union, Literal
from pydantic import Field
import logging

View File

@ -1,5 +1,5 @@
from floki.document.fetcher.base import FetcherBase
from floki.types.document import Document
from dapr_agents.document.fetcher.base import FetcherBase
from dapr_agents.types.document import Document
from typing import List, Dict, Optional, Union, Any
from datetime import datetime
from pathlib import Path

View File

@ -1,4 +1,4 @@
from floki.types.document import Document
from dapr_agents.types.document import Document
from abc import ABC, abstractmethod
from pydantic import BaseModel
from pathlib import Path

View File

@ -1,5 +1,5 @@
from floki.document.reader.base import ReaderBase
from floki.types.document import Document
from dapr_agents.document.reader.base import ReaderBase
from dapr_agents.types.document import Document
from typing import List, Dict, Optional
from pathlib import Path

View File

@ -1,5 +1,5 @@
from floki.types.document import Document
from floki.document.reader.base import ReaderBase
from dapr_agents.types.document import Document
from dapr_agents.document.reader.base import ReaderBase
from typing import List, Dict, Optional
from pathlib import Path

View File

@ -1,5 +1,5 @@
from floki.document.reader.base import ReaderBase
from floki.types.document import Document
from dapr_agents.document.reader.base import ReaderBase
from dapr_agents.types.document import Document
from pathlib import Path
from typing import List
from pydantic import Field

View File

@ -1,7 +1,7 @@
from pydantic import BaseModel, ConfigDict, Field
from abc import ABC, abstractmethod
from typing import List, Optional, Callable
from floki.types.document import Document
from dapr_agents.types.document import Document
import re
import logging

View File

@ -1,4 +1,4 @@
from floki.document.splitter.base import SplitterBase
from dapr_agents.document.splitter.base import SplitterBase
from typing import List
import logging

View File

@ -1,5 +1,5 @@
from floki.prompt.base import PromptTemplateBase
from floki.prompt.prompty import Prompty
from dapr_agents.prompt.base import PromptTemplateBase
from dapr_agents.prompt.prompty import Prompty
from typing import Union, Dict, Any, Optional
from pydantic import BaseModel, Field
from abc import ABC, abstractmethod

View File

@ -1,5 +1,5 @@
from floki.types.llm import ElevenLabsClientConfig
from floki.llm.base import LLMClientBase
from dapr_agents.types.llm import ElevenLabsClientConfig
from dapr_agents.llm.base import LLMClientBase
from typing import Any, Optional
from pydantic import Field
import os

View File

@ -1,4 +1,4 @@
from floki.llm.elevenlabs.client import ElevenLabsClientBase
from dapr_agents.llm.elevenlabs.client import ElevenLabsClientBase
from typing import Optional, Union, Any
from pydantic import Field
import logging

View File

@ -1,9 +1,9 @@
from floki.llm.huggingface.client import HFHubInferenceClientBase
from floki.llm.utils import RequestHandler, ResponseHandler
from floki.prompt.prompty import Prompty
from floki.types.message import BaseMessage
from floki.llm.chat import ChatClientBase
from floki.tool import AgentTool
from dapr_agents.llm.huggingface.client import HFHubInferenceClientBase
from dapr_agents.llm.utils import RequestHandler, ResponseHandler
from dapr_agents.prompt.prompty import Prompty
from dapr_agents.types.message import BaseMessage
from dapr_agents.llm.chat import ChatClientBase
from dapr_agents.tool import AgentTool
from typing import Union, Optional, Iterable, Dict, Any, List, Iterator, Type
from pydantic import BaseModel
from pathlib import Path

View File

@ -1,5 +1,5 @@
from floki.types.llm import HFInferenceClientConfig
from floki.llm.base import LLMClientBase
from dapr_agents.types.llm import HFInferenceClientConfig
from dapr_agents.llm.base import LLMClientBase
from typing import Optional, Dict, Any, Union
from huggingface_hub import InferenceClient
from pydantic import Field, model_validator

View File

@ -1,9 +1,9 @@
from floki.llm.utils import RequestHandler, ResponseHandler
from floki.llm.nvidia.client import NVIDIAClientBase
from floki.types.message import BaseMessage
from floki.llm.chat import ChatClientBase
from floki.prompt.prompty import Prompty
from floki.tool import AgentTool
from dapr_agents.llm.utils import RequestHandler, ResponseHandler
from dapr_agents.llm.nvidia.client import NVIDIAClientBase
from dapr_agents.types.message import BaseMessage
from dapr_agents.llm.chat import ChatClientBase
from dapr_agents.prompt.prompty import Prompty
from dapr_agents.tool import AgentTool
from typing import Union, Optional, Iterable, Dict, Any, List, Iterator, Type
from openai.types.chat import ChatCompletionMessage
from pydantic import BaseModel, Field

View File

@ -1,5 +1,5 @@
from floki.types.llm import NVIDIAClientConfig
from floki.llm.base import LLMClientBase
from dapr_agents.types.llm import NVIDIAClientConfig
from dapr_agents.llm.base import LLMClientBase
from typing import Any, Optional
from pydantic import Field
from openai import OpenAI

View File

@ -1,5 +1,5 @@
from openai.types.create_embedding_response import CreateEmbeddingResponse
from floki.llm.nvidia.client import NVIDIAClientBase
from dapr_agents.llm.nvidia.client import NVIDIAClientBase
from typing import Union, Dict, Any, Literal, List, Optional
from pydantic import Field
import logging

View File

@ -1,6 +1,6 @@
from floki.llm.openai.client.base import OpenAIClientBase
from floki.llm.utils import RequestHandler
from floki.types.llm import (
from dapr_agents.llm.openai.client.base import OpenAIClientBase
from dapr_agents.llm.utils import RequestHandler
from dapr_agents.types.llm import (
AudioSpeechRequest, AudioTranscriptionRequest,
AudioTranslationRequest, AudioTranscriptionResponse, AudioTranslationResponse,
)

View File

@ -1,10 +1,10 @@
from floki.types.llm import AzureOpenAIModelConfig, OpenAIModelConfig
from floki.llm.utils import RequestHandler, ResponseHandler
from floki.llm.openai.client.base import OpenAIClientBase
from floki.types.message import BaseMessage
from floki.llm.chat import ChatClientBase
from floki.prompt.prompty import Prompty
from floki.tool import AgentTool
from dapr_agents.types.llm import AzureOpenAIModelConfig, OpenAIModelConfig
from dapr_agents.llm.utils import RequestHandler, ResponseHandler
from dapr_agents.llm.openai.client.base import OpenAIClientBase
from dapr_agents.types.message import BaseMessage
from dapr_agents.llm.chat import ChatClientBase
from dapr_agents.prompt.prompty import Prompty
from dapr_agents.tool import AgentTool
from typing import Union, Optional, Iterable, Dict, Any, List, Iterator, Type
from openai.types.chat import ChatCompletionMessage
from pydantic import BaseModel, Field, model_validator

View File

@ -1,6 +1,6 @@
from azure.identity import DefaultAzureCredential, ManagedIdentityCredential, get_bearer_token_provider
from floki.types.llm import AzureOpenAIClientConfig
from floki.llm.utils import HTTPHelper
from dapr_agents.types.llm import AzureOpenAIClientConfig
from dapr_agents.llm.utils import HTTPHelper
from openai import AzureOpenAI
from typing import Union, Optional
import logging

View File

@ -1,6 +1,6 @@
from floki.types.llm import OpenAIClientConfig, AzureOpenAIClientConfig
from floki.llm.openai.client import AzureOpenAIClient, OpenAIClient
from floki.llm.base import LLMClientBase
from dapr_agents.types.llm import OpenAIClientConfig, AzureOpenAIClientConfig
from dapr_agents.llm.openai.client import AzureOpenAIClient, OpenAIClient
from dapr_agents.llm.base import LLMClientBase
from openai import OpenAI, AzureOpenAI
from typing import Any, Optional, Union, Dict
from pydantic import Field

View File

@ -1,5 +1,5 @@
from floki.types.llm import OpenAIClientConfig
from floki.llm.utils import HTTPHelper
from dapr_agents.types.llm import OpenAIClientConfig
from dapr_agents.llm.utils import HTTPHelper
from typing import Union, Optional
from openai import OpenAI
import logging

View File

@ -1,5 +1,5 @@
from openai.types.create_embedding_response import CreateEmbeddingResponse
from floki.llm.openai.client.base import OpenAIClientBase
from dapr_agents.llm.openai.client.base import OpenAIClientBase
from typing import Union, Dict, Any, Literal, List, Optional
from pydantic import Field, model_validator
import logging

Some files were not shown because too many files have changed in this diff.