discourse-ai/lib/completions/endpoints
Latest commit: Sam, 37dbd48513, 2025-06-19 16:00:11 +10:00

FIX: implement max_output tokens (anthropic/openai/bedrock/gemini/open router) (#1447)

* Previously this feature existed but was not implemented. Also updates several models in our presets to point to the latest versions.
* Implementing in base is safer, simpler, and easier to manage.
* Anthropic 3.5 is getting older; use 4.0 here and fix the spec.
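The commit notes that implementing the cap in the shared base endpoint is "safer, simpler and easier to manage" than repeating it per provider. A minimal sketch of that idea, with all class and method names assumed for illustration (not the actual discourse-ai API), might look like:

```ruby
# Hypothetical sketch: clamping the requested completion budget against a
# model's max_output_tokens in one shared base class, so every provider
# subclass (Anthropic, OpenAI, Bedrock, Gemini, OpenRouter) inherits the rule.
class BaseEndpoint
  def initialize(max_output_tokens:)
    @max_output_tokens = max_output_tokens
  end

  # Returns the token budget actually sent to the provider: the caller's
  # request, clamped to the model's configured maximum.
  def effective_max_tokens(requested)
    return @max_output_tokens if requested.nil?
    [requested, @max_output_tokens].min
  end
end
```

With this shape, a provider subclass only builds its request payload from `effective_max_tokens`, and the clamping logic lives in exactly one place.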
File                 Last commit                                                                              Date
anthropic.rb         FIX: implement max_output tokens (anthropic/openai/bedrock/gemini/open router) (#1447)   2025-06-19 16:00:11 +10:00
aws_bedrock.rb       FIX: implement max_output tokens (anthropic/openai/bedrock/gemini/open router) (#1447)   2025-06-19 16:00:11 +10:00
base.rb              FIX: implement max_output tokens (anthropic/openai/bedrock/gemini/open router) (#1447)   2025-06-19 16:00:11 +10:00
canned_response.rb   FEATURE: Use different personas to power AI helper features.                              2025-06-04 14:23:00 -03:00
cohere.rb
fake.rb              FEATURE: forum researcher persona for deep research (#1313)                               2025-05-14 12:36:16 +10:00
gemini.rb            FIX: implement max_output tokens (anthropic/openai/bedrock/gemini/open router) (#1447)   2025-06-19 16:00:11 +10:00
hugging_face.rb
mistral.rb
ollama.rb
open_ai.rb           FEATURE: optionally support OpenAI responses API (#1423)                                  2025-06-11 17:12:25 +10:00
open_router.rb
samba_nova.rb
vllm.rb