edco Just to be sure: to disallow all search engines from crawling my Flarum forum, I would have to create a robots.txt file, upload it to the site root, and write in that file (and just that): User-agent: * Disallow: / Right?
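For reference, the exact contents described above would be:

```
User-agent: *
Disallow: /
```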
luceos edco Not all robots respect robots.txt; harvesters in particular tend to ignore it. You'd be better off putting everything behind a login, or adding firewall rules or IP blocking through .htaccess or your web server settings.
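As a sketch of the IP-blocking approach, assuming an Apache server with version 2.4+ syntax and mod_authz_core enabled (the IP range here is a placeholder, not a real crawler's address):

```
# .htaccess - deny a specific IP range while allowing everyone else
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
</RequireAll>
```

On nginx or other servers the equivalent rules go in the server configuration instead, since those servers do not read .htaccess files.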