jaikrishna-tavva9

In your local extend.php add:

(new \FoF\Sitemap\Extend\RemoveResource(\FoF\Sitemap\Resources\User::class)),

    IanM for completeness' sake, that line has to go within the [], see:

    
    return [
        (new \FoF\Sitemap\Extend\RemoveResource(\FoF\Sitemap\Resources\User::class)),
    ];
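
    For anyone unsure where this goes: extend.php lives in the Flarum root directory, next to composer.json, and the whole file returns a single array of extenders. As a minimal sketch (assuming no other extenders are registered in your file):

    <?php

    // extend.php in the Flarum root directory.
    // All extenders go in the one returned array; a PHP file stops
    // at its first return, so there must be only one return [].
    return [
        (new \FoF\Sitemap\Extend\RemoveResource(\FoF\Sitemap\Resources\User::class)),
    ];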

    Didn't the permission also remove the users?

      girefko Submitting URLs in the sitemap and then blocking them in robots.txt with a Disallow rule is really bad practice; we will see these URLs as errors in Search Console. v17development/flarum-seo63

      make sure the "Allow all bots & crawl full site directory" setting in the SEO extension is set to off
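
      For illustration, the conflicting setup is user URLs in the sitemap combined with a robots.txt rule like this (/u/ is Flarum's default user profile path; adjust if yours differs):

      User-agent: *
      Disallow: /u/

      Search Console then typically flags those user URLs with a "blocked by robots.txt" error, because the sitemap invites the crawler to pages robots.txt forbids it to fetch.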

        Hari Hi Hari, thanks for the info!

        So I removed Disallow: /u/ from robots.txt and added this code to my extend.php file:

        return [
            (new \FoF\Sitemap\Extend\RemoveResource(\FoF\Sitemap\Resources\User::class)),
        ];

        Would that be sufficient to remove user pages from crawling/bot requests?

          girefko I was looking for this code for a few years, but I forgot to ask the community. Thanks a lot for sharing! Could you share a screenshot showing where exactly you have pasted this code?

          girefko Would that be sufficient to remove user pages from crawling/bot requests?

          We should use both, because one acts to remove pages from the index, while the extend.php code removes user URLs from the sitemap, which prevents the spider from crawling them again.

          Is this looking good?

          Edit: this is not working @girefko

            girefko no luck, I removed that and tried :s

            I still see user URLs in the sitemap. I deleted the sitemap file from public_html, changed the settings to runtime mode, and flushed the CF page cache too.
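
            One quick way to see what is actually being served (a sketch; the hostname is a placeholder, and if your sitemap is an index file, check the linked sub-sitemaps instead):

            curl -s https://your-forum.example/sitemap.xml | grep -c "/u/"

            A count of zero means no user URLs remain; a non-zero count after the change usually points at a cache still serving the old file.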

            luceos I cannot find any setting called "user listing" on the permissions page, and the extend.php code is not working either. Any suggestions?

            This is not working. Do you think this slash should not be there? \

            Hari the problem here is that you have two return [] statements. That's not correct: PHP stops executing a file at its first return, so the extenders in the second array are never registered.

            For your case, you need something like:

            return [
                (new \Blomstra\Redis\Extend\Redis([
                    'host' => '127.0.0.1',
                    'password' => null,
                    'port' => 6379,
                    'database' => 1,
                ]))
                ->useDatabaseWith('queue', 2)
                ->useDatabaseWith('session', 3)
                ->disable('cache'),
            
                (new \FoF\Sitemap\Extend\RemoveResource(\FoF\Sitemap\Resources\User::class)),
            ];

              Hari Hey, glad you found the solution! Can you explain why I should disable the "Allow all bots & crawl full site directory" setting in the SEO extension?

                girefko I could not recall it; when I enabled it, I think the site directory setting included the user directory from the FoF extension (not sure).

                I thought indexing discussions is good enough. There is one more setting called "crawl full discussion"; it basically enables spiders to go through the entire discussion, which causes a huge load on the server in the case of larger communities.

                a month later

                Mercury shows there’s a new version 2.0.0-beta.1. I guess that’s a mistake. Or can we update to it?

                  CyberGene We are currently working on a rewrite of the extension to better support large communities with discussions and/or users in the millions.

                  2.0.0-beta.1 is our first iteration, which works fairly well, but still has some performance issues when the counts are in the multi-million range.

                  For the time being, I'd suggest not upgrading to this unless you have a specific need to do so. Once we've ironed out all of the issues, we will remove the beta label and publish the new version.

                    IanM We are currently working on a rewrite of the extension to better support large communities with discussions and/or users in the millions.

                    Not that my Flarum is that big yet, but I'm excited to hear about the continued development of this extension!

                    a month later

                    Sitemap version 2.0

                    • PHP 8.0 requirement
                    • Complete rewrite of the deploy system.
                    • Simplification of the available modes: now only runtime and cached multi-file are available.
                    • Runtime now depends on the same logic as multi-file, so it is no longer limited to 50,000 items. But it is still not advisable to use it with that many items.
                    • Added admin setting to exclude all user profiles from sitemap.
                    • Added ForceCached extender (see the sketch after this list).
                    • Added optional "risky" performance improvements. This option is likely not for you. Consult the README for details.
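
                    As a sketch, registering the new ForceCached extender in extend.php would look like this (assuming it lives alongside RemoveResource in the FoF\Sitemap\Extend namespace and takes no constructor arguments; consult the README for the actual signature):

                    return [
                        // Hypothetical usage: force the cached multi-file mode.
                        (new \FoF\Sitemap\Extend\ForceCached()),
                    ];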

                    To update from version 1.x

                    The update requires PHP 8.0 or greater. If you are still on PHP 7.3 or 7.4, you can continue using version 1.x. There will be no new updates for the 1.x line.

                    If composer update doesn't pick up the new version, you can force the update to version 2.0 with this command:

                    composer require fof/sitemap:"*"

                    Clear the cache:

                    php flarum cache:clear

                    If you were running runtime mode, that's everything you need. The extension should work right away.

                    If you were running cache or disk mode, the mode will automatically switch to the new multi-file mode.

                    Then run the following command to generate a first sitemap immediately:

                    php flarum fof:sitemap:build

                    If you had already configured the scheduler, the new build command will automatically run at the previously selected time.
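
                    If you haven't configured the scheduler yet, it is a standard crontab entry that triggers Flarum's scheduler every minute (the installation path is a placeholder):

                    * * * * * cd /path/to/flarum && php flarum schedule:run >> /dev/null 2>&1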

                    If you get a filesystem write error, verify Flarum can write to the public folder. A sitemaps folder will be created and all subsequent write operations will be in that folder.
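
                    A quick way to test writability from the shell (the www-data user and the installation path are assumptions; adjust for your server):

                    sudo -u www-data touch /path/to/flarum/public/write-test && rm /path/to/flarum/public/write-test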

                    This update was sponsored by Blomstra.

                      I have tried these commands, but they did not work:

                      composer update fof/sitemap
                      php flarum migrate
                      php flarum cache:clear

                      and

                      composer require fof/sitemap:"*"

                        meihuak Is there an error? Can you share your php flarum info output and also the full composer output of the require command?