ChatGPT: AI-powered Auto-Reply


A Flarum extension.

The ChatGPT extension for Flarum includes an auto-reply discussion feature, customizable max tokens, and permission controls for who can use this feature.

The auto-reply feature uses the text-davinci-003 model to generate quick and accurate responses to users' questions.

Installation

This extension requires Flarum 1.7 and PHP 8.1, since it uses openai-php/client.
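You can check the PHP requirement on your server before installing. This is a minimal sketch, not part of the extension; it assumes `php` is on your PATH and that your `sort` supports version-aware comparison (`-V`):

```shell
# Hypothetical pre-install check: compare the server's PHP version
# against the 8.1 minimum required by openai-php/client.
required="8.1"
current=$(php -r 'echo PHP_MAJOR_VERSION . "." . PHP_MINOR_VERSION;')

# sort -V sorts version strings numerically, so the smaller version comes first
if [ "$(printf '%s\n' "$required" "$current" | sort -V | head -n1)" = "$required" ]; then
  echo "PHP $current meets the 8.1 requirement"
else
  echo "PHP $current is too old; upgrade to 8.1 or later" >&2
fi
```

If the check fails, upgrade PHP (or ask your hosting provider to) before running the install command below.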

Install with composer:

composer require datlechin/flarum-chatgpt:"*"

Updating

composer update datlechin/flarum-chatgpt:"*"
php flarum migrate
php flarum cache:clear

Links

All contributions are welcome 🥰

    Thank you for an amazing extension!!

    It appears that we are unable to change the "Max Tokens" setting in the Admin panel. Whatever value we add, the default is still 100 tokens.

    Even if ChatGPT's responses are sometimes inadequate, this is an excellent extension with great potential in the near future.

    Beautiful work, Mr. Đạt.

      V0.1.0

      Now you can choose which user acts as the forum's AI assistant (by entering the user ID in the User Prompt input).

      The feature is still very rudimentary; in the future there will be a dropdown to select the user instead of entering the user ID.

      datlechin Thanks for the contribution; it's a very interesting way to generate content in forums with very little activity, and it can encourage participation.

      I have some suggestions to improve the extension:

      • An option to enable replies only on the first message of a discussion (to save tokens and be able to reply to a larger number of different users).
      • The ability to choose the language of the response. Maybe this is not necessary, I have yet to test the extension. I have used ChatGPT writing in Spanish and it has responded in the same language without a problem even though the interface was in English.
      • A token counter, to know how much availability remains before using the AI or deciding whether to increase the quota.
      • Select which tags the AI user will work on. To allow filtering by sections.
      • An option to choose whether generated messages should always be pending approval.

      In the future I will try to contribute a basic PR, but for now I don't have enough time, and I want to learn a bit about how to develop extensions so I can help.

        I'm sorry... how exactly do I use this?
        I tried an existing forum, nothing is happening; started a new one, nothing is happening.
        Do I have to address a user with @?
        I'm sorry... I'm confused...

        I have installed:
        openai-php/client.
        composer require datlechin/flarum-chatgpt:"*"
        I entered the API Key
        User prompt is empty

        I have:
        Flarum 1.7.1
        PHP 8.1.15

        Thanx in advance.. 👍

          Dr_Sommer several things can make it not work:

          • your API key is invalid
          • you have enabled the option 'Enable on started discussion'
          • permissions in admin

          This is interesting stuff. Can we use a separate user account for generating answers ?

            r4nchy Yep, you can enter your user ID in the User Prompt input. In the future, it will be a dropdown to select the user instead 😅

            Hi, thanks for the great extension, just a quick question, can this work with php 8.0?

              After a clean upgrade to 1.7.x, and before installing the extension, I plan to move to PHP 8.1 😉

                Hello, excuse my ignorance. But is installing OpenAI PHP something that my hosting must do? I'm on a VPS and I don't know if I can do it.

                  Oshvam
                  If your current server can run Flarum 1.7.1, then ask your VPS provider to update/select PHP version 8.1.x and verify the result.

                    FBI Perfect. I understand, then, that nothing else is needed other than upgrading my PHP version to 8.x, correct? I thought something else had to be installed. Thank you very much.