discourse-ai/app
Sam cf220c530c
FIX: Improve MessageBus efficiency and correctly stop streaming (#1362)
* FIX: Improve MessageBus efficiency and correctly stop streaming

This commit enhances the message bus implementation for AI helper streaming by:

- Adding client_id targeting for message bus publications to ensure only the requesting client receives streaming updates
- Limiting MessageBus backlog size (2) and age (60 seconds) to prevent Redis bloat
- Replacing clearTimeout with Ember's runloop cancel method so the pending stop timer is properly cleaned up (we were leaking a stop timer)
- Adding tests for client-specific message delivery

These changes improve memory usage and make streaming more reliable by ensuring messages are properly directed to the requesting client.
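The client-targeting and backlog-capping behaviour above can be sketched with a tiny in-memory stand-in. TinyBus here is hypothetical and much simpler than the real MessageBus gem (whose publish call accepts client_ids, max_backlog_size, and max_backlog_age keywords); it only illustrates the two semantics this commit relies on: messages tagged with client_ids are visible solely to those clients, and the per-channel backlog is capped so old entries are evicted.

```ruby
# Hypothetical in-memory illustration (NOT the real MessageBus API) of
# client_id-targeted publication with a capped per-channel backlog.
class TinyBus
  Message = Struct.new(:data, :client_ids)

  def initialize(max_backlog_size: 2)
    @max_backlog_size = max_backlog_size
    @channels = Hash.new { |h, k| h[k] = [] }
  end

  # Publish to a channel; when client_ids is given, only those clients
  # will see the message when reading the backlog.
  def publish(channel, data, client_ids: nil)
    backlog = @channels[channel]
    backlog << Message.new(data, client_ids)
    # Evict oldest entries so the backlog (Redis, in the real gem) stays small.
    backlog.shift while backlog.size > @max_backlog_size
  end

  # Messages on a channel that are visible to a given client.
  def backlog(channel, client_id)
    @channels[channel]
      .select { |m| m.client_ids.nil? || m.client_ids.include?(client_id) }
      .map(&:data)
  end
end

bus = TinyBus.new(max_backlog_size: 2)
bus.publish("/stream", "chunk-1", client_ids: ["client-a"])
bus.publish("/stream", "chunk-2", client_ids: ["client-a"])
bus.publish("/stream", "chunk-3", client_ids: ["client-a"])

p bus.backlog("/stream", "client-a") # oldest chunk was evicted by the size cap
p bus.backlog("/stream", "client-b") # other clients receive nothing
```

The eviction step is also why the follow-up commit notes that a backlog of 2 is risky when a channel name is shared: two clients publishing to one channel would evict each other's chunks.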

* The composer suggestion flow needed the same fix.

* A backlog size of 2 is risky here because the same channel name is reused between clients
2025-05-23 16:23:06 +10:00
controllers/discourse_ai FIX: Improve MessageBus efficiency and correctly stop streaming (#1362) 2025-05-23 16:23:06 +10:00
helpers/discourse_ai/ai_bot FIX: automatically bust cache for share ai assets (#942) 2024-11-22 11:23:15 +11:00
jobs FIX: Improve MessageBus efficiency and correctly stop streaming (#1362) 2025-05-23 16:23:06 +10:00
mailers FEATURE: support sending AI report to an email address (#368) 2023-12-19 17:51:49 +11:00
models FIX: Structured output discrepancies. (#1340) 2025-05-15 11:32:10 -03:00
serializers FEATURE: Examples support for personas. (#1334) 2025-05-13 10:06:16 -03:00
services DEV: Use full URL for problem check message (#1165) 2025-03-05 11:31:23 +08:00
views FIX: automatically bust cache for share ai assets (#942) 2024-11-22 11:23:15 +11:00