@MichaelBelgium I’ve identified a few issues. OpenRouter models such as `deepseek/deepseek-chat-v3.1:free` are causing problems: any model ID in the `provider/model:free` format doesn’t work properly.
System prompts cannot be provided, which is a very important feature for these forums.
For example, the default GPT-5 system prompt looks like this:
````
You are GPT-5 Pro, a large language model from OpenAI.

Formatting Rules:
- Use Markdown **only when semantically appropriate**. Examples: `inline code`, ```code fences```, tables, and lists.
- In assistant responses, format file names, directory paths, function names, and class names with backticks (`).
- For math: use \( and \) for inline expressions, and \[ and \] for display (block) math.
````
This prompt needs to be customized for each forum.
The bot should be able to introduce itself using its forum username instead of identifying as “GPT-5.”
Settings like temperature are exposed by every provider’s API and should be configurable as well.
If we want the bot to behave like an actual forum user rather than a plain OpenAI bot, these features are necessary.
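As a sketch of what I mean, here is roughly how a per-forum system prompt and temperature could be assembled into an OpenAI-compatible chat request body (Python for illustration; the function and setting names are mine, not the extension’s):

```python
def build_chat_payload(forum_username: str, user_message: str,
                       model: str, temperature: float) -> dict:
    """Assemble an OpenAI-compatible chat.completions request body with a
    forum-specific system prompt. All names here are illustrative."""
    system_prompt = (
        f"You are {forum_username}, a helpful member of this forum. "
        "Introduce yourself by that name, not as the underlying model."
    )
    return {
        "model": model,
        "temperature": temperature,  # should be a per-forum setting
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_payload(
    "HelperBot", "Hello!", "deepseek/deepseek-chat-v3.1:free", 0.7
)
```

The same body works against OpenRouter’s OpenAI-compatible endpoint, so nothing provider-specific is needed.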
Also, I’m curious why you’re not using a queue. The old extension had queue support, and queuing replies would keep discussions from dragging out unnecessarily.
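A minimal sketch of the idea with Python’s standard `queue`/`threading` (the real extension would presumably use its framework’s own job queue): reply jobs go onto a queue and a single worker processes them in order, so one slow API call doesn’t block everything else.

```python
import queue
import threading

jobs: queue.Queue = queue.Queue()
results: list[str] = []

def worker() -> None:
    """Drain the queue one job at a time, posting replies in order."""
    while True:
        post_id = jobs.get()
        if post_id is None:  # sentinel: shut the worker down
            jobs.task_done()
            break
        # ... here the real worker would call the LLM API and post the reply ...
        results.append(f"replied to post {post_id}")
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()
for post_id in (101, 102, 103):
    jobs.put(post_id)
jobs.put(None)   # stop signal
jobs.join()      # wait until every job is processed
```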
And finally, the bot doesn’t respond to mentions or replies; I think that should be supported as well.