discourse-ai/spec/models
Sam d07cf51653
FEATURE: llm quotas (#1047)
Adds a comprehensive quota management system for LLM models that allows:

- Setting per-group (applied per user in the group) token and usage limits with configurable durations
- Tracking and enforcing token/usage limits across user groups
- Quota reset periods (hourly, daily, weekly, or custom)
- Admin UI for managing quotas with real-time updates

This system provides granular control over LLM API usage by allowing admins
to define limits on both total tokens and number of requests per group.
It supports multiple concurrent quotas per model and automatically handles
quota resets.
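
As a rough illustration of the idea, here is a minimal plain-Ruby sketch of a per-group quota with token and request-count limits over a rolling window. The names (`LlmQuota`, `QuotaUsage`, `max_tokens`, `duration_seconds`) are illustrative assumptions, not the plugin's actual schema or API; see `llm_quota_spec.rb` and `llm_quota_usage_spec.rb` below for the real behaviour.

```ruby
# Illustrative sketch only -- not the plugin's actual models or API.
LlmQuota = Struct.new(:group, :max_tokens, :max_usages, :duration_seconds, keyword_init: true)

class QuotaUsage
  def initialize(quota)
    @quota = quota
    reset!
  end

  # Record one request and its token cost, resetting first if the window expired.
  def record!(tokens:)
    reset! if Time.now - @window_started_at >= @quota.duration_seconds
    @tokens_used += tokens
    @usages += 1
  end

  # A request is blocked once either the token or the request-count limit is hit.
  def exceeded?
    (@quota.max_tokens && @tokens_used > @quota.max_tokens) ||
      (@quota.max_usages && @usages > @quota.max_usages)
  end

  private

  def reset!
    @tokens_used = 0
    @usages = 0
    @window_started_at = Time.now
  end
end

# Example: a group gets 50k tokens or 100 requests per day, whichever limit is hit first.
quota = LlmQuota.new(group: "trust_level_1", max_tokens: 50_000, max_usages: 100, duration_seconds: 86_400)
usage = QuotaUsage.new(quota)
usage.record!(tokens: 1200)
puts usage.exceeded? # => false
```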


Co-authored-by: Keegan George <kgeorge13@gmail.com>
2025-01-14 15:54:09 +11:00
ai_persona_spec.rb FEATURE: smarter persona tethering (#832) 2024-10-16 07:20:31 +11:00
ai_tool_spec.rb FIX: encode parameters returned from LLMs correctly (#889) 2024-11-04 10:07:17 +11:00
completion_prompt_spec.rb FIX: regression, no longer sending examples to AI helper (#993) 2024-12-03 16:03:46 +11:00
llm_model_spec.rb DEV: Prefer ENV key for seeded models (#893) 2024-11-05 06:19:13 -08:00
llm_quota_spec.rb FEATURE: llm quotas (#1047) 2025-01-14 15:54:09 +11:00
llm_quota_usage_spec.rb FEATURE: llm quotas (#1047) 2025-01-14 15:54:09 +11:00
model_accuracy_spec.rb DEV: Update linting (#423) 2024-01-13 00:28:06 +01:00
rag_document_fragment_spec.rb REFACTOR: Separation of concerns for embedding generation. (#1027) 2024-12-16 09:55:39 -03:00
reviewable_ai_chat_message_spec.rb DEV: Fix new Rubocop offenses 2024-03-06 15:23:29 +01:00
reviewable_ai_post_spec.rb DEV: Fix new Rubocop offenses 2024-03-06 15:23:29 +01:00
shared_ai_conversation_spec.rb DEV: Rewire AI bot internals to use LlmModel (#638) 2024-06-18 14:32:14 -03:00
user_option_spec.rb DEV: Clearly separate post/composer helper settings (#747) 2024-08-12 15:40:23 -07:00