I started using it and it works great so far, thanks @luceos!
Maybe the only thing that needs a bit of polishing: it sometimes mentions users twice when replying (not always, and I'm not sure what triggers it), and the "ChatGPT wrote:" line could be removed.
P.S. The AI responds only when mentioned with @. It would be good if it could also reply when someone uses the "reply" link on one of its previous posts.
P.P.S. We asked it too many times in one discussion and it stopped responding, with the following error in the logs:
This model's maximum context length is 4097 tokens. However, your messages resulted in 4940 tokens. Please reduce the length of the messages
Are you feeding it the entire discussion as input? Or all of its previous answers? I'm wondering whether that could incur high costs, since the token count will keep growing with huge discussions or long conversations with the bot. Just thinking out loud.
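In case it helps, here's a rough sketch of one way to keep the prompt under the limit: drop the oldest messages until what's left fits a token budget. This is just illustrative Python, not the extension's actual code, and the ~4-characters-per-token estimate is a crude heuristic (a real tokenizer would be needed for exact counts):

```python
# Sketch: cap the conversation history sent to the API so the prompt
# stays under the model's 4097-token context limit.
# Assumptions: a reserved reply budget of 1000 tokens and a rough
# 4-chars-per-token estimate (both are illustrative, not from the extension).

MAX_CONTEXT_TOKENS = 4097
RESPONSE_BUDGET = 1000  # tokens reserved for the model's reply (assumption)

def estimate_tokens(text: str) -> int:
    # Crude approximation: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: list[str]) -> list[str]:
    """Keep the most recent messages that fit the prompt budget."""
    budget = MAX_CONTEXT_TOKENS - RESPONSE_BUDGET
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):   # walk newest to oldest
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break                    # oldest messages get dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))      # restore chronological order
```

Something like this would bound both the error and the cost, since the prompt size stops growing with the discussion.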
P.P.P.S. @luceos, is the extension thread-safe? We noticed odd behavior where it replied to one user but included a mention of another user who wasn't part of the discussion at all. That other user may have posted at around the same time in a different discussion: