@MichaelBelgium In this discussion — https://discuss.flarum.org/d/36285-help-needed-for-handling-mentions-in-forum-posts/2 — it was mentioned that the code referenced at flarum/framework/blob/1.x/extensions/mentions/extend.php#L50-L55 could be trusted, though I haven't tested it yet. It's probably possible.
https://platform.openai.com/docs/guides/prompt-engineering
I think it works with the role and instructions parameters.
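To make that concrete, here's a minimal sketch of how those two parameters are typically laid out in an OpenAI-style request. This is only an illustration of the payload shape, not tested code from this thread — the model name and prompt strings are placeholders, and the `instructions` field is from the newer Responses-style API while `role` lives in the chat-completions message list:

```python
import json

# Chat-completions style: the "role" field separates the standing
# instructions ("system") from the forum content the AI should answer ("user").
chat_payload = {
    "model": "gpt-4o-mini",  # placeholder model name, not from the thread
    "messages": [
        {"role": "system", "content": "You are a helpful forum assistant. Reply concisely."},
        {"role": "user", "content": "How do I handle @mentions in Flarum posts?"},
    ],
}

# Responses-API style: the same standing prompt goes in "instructions"
# instead of a system message.
responses_payload = {
    "model": "gpt-4o-mini",
    "instructions": "You are a helpful forum assistant. Reply concisely.",
    "input": "How do I handle @mentions in Flarum posts?",
}

print(json.dumps(chat_payload, indent=2))
print(json.dumps(responses_payload, indent=2))
```

Either shape should work with OpenAI-compatible providers; which one applies depends on the endpoint the extension calls.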
If I remember correctly, the Flarum Gemini extension had both queue support and mention support. Thanks to the queue, even with slow AI models the page reloaded quickly after starting a discussion; the AI comment was added later in the background. Take a look at that.
Also, even without a queue, I think you could redirect the user straight into the discussion and run the API call in the background.
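The pattern described above (return to the user immediately, let a worker add the AI reply afterwards) can be sketched generically. This is not Flarum or Laravel code — just a stdlib Python illustration of the queue idea, with the slow AI call simulated by a short sleep:

```python
import queue
import threading
import time

jobs: "queue.Queue" = queue.Queue()
replies = []

def worker():
    # Background worker: picks up queued discussion IDs and posts
    # the AI comment later, so the page load never waits on the model.
    while True:
        discussion_id = jobs.get()
        if discussion_id is None:  # sentinel value shuts the worker down
            break
        time.sleep(0.01)  # stand-in for the slow AI API call
        replies.append(f"AI reply for {discussion_id}")

t = threading.Thread(target=worker, daemon=True)
t.start()

# "Start a discussion": enqueue the job and return to the user right away.
jobs.put("discussion-42")

jobs.put(None)  # tell the worker to stop once the queue drains
t.join()
print(replies)
```

In Flarum's case the equivalent would be Laravel's queued jobs, which is presumably what the Gemini extension used.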
I wasn’t able to run the :free models on OpenRouter before, but I’ll try again.