From 3451cd334663afebb273a500e796ed76ca8808fa Mon Sep 17 00:00:00 2001
From: Josh van Leeuwen
Date: Tue, 9 Sep 2025 14:11:42 -0400
Subject: [PATCH] Apply suggestions from code review

Co-authored-by: Mark Fussell
Signed-off-by: Josh van Leeuwen
---
 .../supported-state-stores/setup-azure-cosmosdb.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-azure-cosmosdb.md b/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-azure-cosmosdb.md
index 6bd9746e3..9bbc16a04 100644
--- a/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-azure-cosmosdb.md
+++ b/daprdocs/content/en/reference/components-reference/supported-state-stores/setup-azure-cosmosdb.md
@@ -227,8 +227,8 @@ This particular optimization only makes sense if you are saving large objects to
 
 ## Workflow Limitations
 
-The more complex a workflow is (number of activities, child workflows, etc.), the more state operations it will perform per state store transaction.
-All input & output values will also be saved to the workflow history, and will be part of an operation of these transactions.
+The more complex a workflow is, with more activities, child workflows, and so on, the more database state operations it performs per state store transaction.
+All input & output values are saved to the workflow history, and are part of the operations in these transactions.
 CosmosDB has a [maximum document size of 2MB and maximum transaction size of 100 operations.](https://learn.microsoft.com/azure/cosmos-db/concepts-limits#per-request-limits). This means that the workflow history must not exceed this size, meaning that CosmosDB is not suitable for workflows with large input/output values or larger complex workflows.
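
The limits described in the patched section can be sketched as a rough pre-check. This is a minimal illustration, not Dapr or CosmosDB API code: `fits_cosmosdb_limits` and its arguments are hypothetical names, and it assumes each history entry maps to roughly one document and one operation in a transactional batch.

```python
import json

# CosmosDB per-request limits cited in the doc change:
# https://learn.microsoft.com/azure/cosmos-db/concepts-limits#per-request-limits
MAX_DOCUMENT_BYTES = 2 * 1024 * 1024   # 2MB maximum document size
MAX_TRANSACTION_OPS = 100              # maximum operations per transaction


def fits_cosmosdb_limits(history_entries):
    """Rough pre-check: would this workflow history fit CosmosDB's limits?

    `history_entries` is a hypothetical list of JSON-serializable
    input/output values that a workflow would persist to its history.
    """
    # More activities/child workflows mean more state operations per
    # transaction; CosmosDB caps a transaction at 100 operations.
    if len(history_entries) > MAX_TRANSACTION_OPS:
        return False
    # Each entry is assumed to become roughly one document, so its
    # serialized size must stay under the 2MB document limit.
    return all(
        len(json.dumps(entry).encode("utf-8")) <= MAX_DOCUMENT_BYTES
        for entry in history_entries
    )


# A small history fits; a single oversized input/output value does not.
small = [{"activity": "a", "output": "x" * 100} for _ in range(10)]
big = [{"activity": "a", "output": "x" * (3 * 1024 * 1024)}]
print(fits_cosmosdb_limits(small))  # True
print(fits_cosmosdb_limits(big))    # False
```

In practice this is why the doc steers workflows with large input/output values, or very large complex workflows, away from CosmosDB: the runtime cannot split a single oversized value across documents or a single transaction across more than 100 operations.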