I was considering adding a rating per website to the scanner, similar to SSL Labs or SecurityHeaders.io, but I'm not really sure how to do it. Any suggestions are welcome.

I guess it doesn't make sense to rate something based on the forum settings or the number of extensions... The only thing that I think makes sense is server and software security.

I was thinking of something like:

  • A+: same as A, but implements recommended security headers
  • A: correctly configured on HTTPS
  • B: correctly configured on HTTPS but with deprecated extensions or suboptimal redirects
  • C: invalid configuration or HTTP only
  • D: known security issues and/or outdated Flarum and/or vulnerable extensions
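As a rough sketch, the scale above could be computed from a handful of boolean findings. The flag names below are made up for illustration and are not the scanner's actual internals:

```python
def grade(https_ok: bool, headers_ok: bool, suboptimal: bool,
          vulnerable: bool) -> str:
    """Map scan findings to a letter grade, mirroring the scale above."""
    if vulnerable:        # known security issues, outdated Flarum, bad extensions
        return "D"
    if not https_ok:      # invalid configuration or HTTP only
        return "C"
    if suboptimal:        # deprecated extensions or suboptimal redirects
        return "B"
    # correctly configured on HTTPS; A+ if security headers are also present
    return "A+" if headers_ok else "A"

print(grade(https_ok=True, headers_ok=True, suboptimal=False, vulnerable=False))  # A+
print(grade(https_ok=False, headers_ok=False, suboptimal=False, vulnerable=True))  # D
```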

    clarkwinkelmann You could actually add a few more features and base the ranking on a few more factors:

    • Loading speed of the website

    • Caching of the website

    • Whether features like Google reCAPTCHA and the Akismet plugin are enabled

    • Whether the website is showing any other errors

    Besides that, I would ask you to add one other feature: tell users which theme a website is using. It's one of the most used features in WP, so maybe one day it'll also be a big feature here.

      gurjyot Thanks for the suggestions!

      What do you mean by "caching" here?

      If there is any critical error on the forum, the report should already fail at the URL scan section. I could maybe improve it further, but this will basically be a 500 error in most cases, and I can't get more details than that.

      Right now there's no proper theming in Flarum, so I can't really extract a name/package (other than the list of extensions). I was thinking of extracting the custom CSS, but that has limitations: I can only see the compiled CSS, not the source LESS, when scanning the forum.

        clarkwinkelmann By caching I mean whether the website is using the browser cache properly for its users. WP has separate cache plugins, so it's beneficial there, but I guess it's different in the case of Flarum.
        As for errors, I am not focusing on critical ones (once a stable release is out, there will be fewer and fewer critical errors). I am talking about small errors which are sometimes missed by website owners; your website could be used to detect them.

        Yeah, there is no proper theme system in Flarum right now, so this can be left for the future. Although I would suggest not building anything that can show CSS and LESS, because if someone is using a custom design for their website (not meant to be used by others) and others copy that design through your website, the second party could be liable for copyright infringement.

        clarkwinkelmann I made a similar scanner for Magento (sample scan) and chose to use green | orange | red as simple quality indicators. I wanted to keep it simple, as my target audience was store owners, not server admins.

        I also made a test for TTFB, although that varies wildly for a server on the other side of the globe.

        +1 for the proper cache headers for static assets
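        As an illustration, a common nginx pattern for such cache headers (the extensions and lifetime are just examples; assets with revision hashes in their names can be cached aggressively):

```nginx
# Long-lived cache headers for fingerprinted static assets
location ~* \.(css|js|png|jpg|gif|svg|woff2?)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
}
```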

        In the long run, as Flarum matures, there will be more security incidents and other flaws to check for (also in extensions). Right now, I think the scanner is almost complete!

        7 days later

        We have passed 50 successful scans from 30 different forums!

        I hope it has already helped multiple forum owners fix their Flarum settings.

        In particular, I can't emphasize enough how dangerous the exposed folders issue is. I hope every admin fixed it correctly!

        Sanguine Yes indeed. I first had some crude subdomain detection, but I held it back so I can implement it properly once I find the time. I planned to use that very list for it.

        Now, the IP being used as a domain, that's not intended. I just noticed that earlier. I thought I had it handled, but it looks like it isn't. I'll try to fix it soon.

        Another thing I noticed: if your main domain and your www subdomain redirect to the same place, I get an alert that my www subdomain is not secured over HTTPS, even though it redirects to the main domain, which is itself redirected to HTTPS.

          BlackSheep Could you share the report URL you are talking about? I don't see any recent report matching your description.

          Maybe you're talking about the HSTS flag being red? You should serve the HSTS header along with the redirect when redirecting away from an HTTPS page; this will secure the domain even if it is different from your canonical domain.

          I see multiple reports of websites having messy redirects. To my knowledge, the most secure way of redirecting is the following:

          • If the URL is HTTP, redirect to the same page on HTTPS and apply the HSTS header
          • Only then, if the URL uses the wrong domain, redirect to the same page/homepage on the canonical domain (over HTTPS)

          Redirecting to another domain right away while on HTTP prevents using HSTS, so typing the same address or following the same link again in the future might expose the user to a MITM attack.

          Redirecting to an HTTP page from an HTTPS page is also very bad practice, in particular if it's on a different domain where the HSTS header has likely not yet been applied.
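          A minimal nginx sketch of that two-step redirect, assuming example.com is the canonical domain and certificates exist for both hosts (all names and values here are placeholders):

```nginx
# Step 1: any HTTP request first hops to HTTPS on the SAME host
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

# Step 2: once on HTTPS, serve HSTS so the browser remembers this host,
# then send the non-canonical host to the canonical one
server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key omitted for brevity
    add_header Strict-Transport-Security "max-age=15768000" always;
    return 301 https://example.com$request_uri;
}

# Canonical HTTPS server
server {
    listen 443 ssl;
    server_name example.com;
    add_header Strict-Transport-Security "max-age=15768000" always;
    # ... regular site configuration ...
}
```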

            @clarkwinkelmann Sorry, I think I've been talking nonsense, or misspoke. And since an image speaks a hundred times more than a line of text:

            And the nginx configuration that causes the problem: the redirect of www to the main domain:

            server {
              listen 80;
              server_name www.speedrun.cafe;
            
              location / {
                 proxy_set_header Host www.speedrun.cafe;
                 proxy_redirect https://speedrun.cafe/ https://www.speedrun.cafe/;
              }
            }

              BlackSheep I'm not very familiar with nginx, but this looks like a transparent reverse proxy configuration, not an HTTP redirect response. This results in the origin server thinking it's always hit via the correct URL, while in fact both domains keep working.

              Sanguine If you're able to deploy HTTPS for any domain you resolve, it's a lot safer to first redirect to HTTPS.

              Though if you're not using HSTS or 301 redirects, it has little benefit. But everybody should be adding HSTS headers now; it doesn't require any particular effort. And if you use HSTS preloading, which is a very good thing, you'll need to secure every single subdomain you serve anyway.
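              For the www.speedrun.cafe case above, a sketch of real redirects (rather than a proxy) following that advice; the certificate directives are omitted and would need to cover the www host:

```nginx
# HTTP on the www host: first hop to HTTPS on the SAME host,
# so HSTS can be applied before changing domains
server {
    listen 80;
    server_name www.speedrun.cafe;
    return 301 https://www.speedrun.cafe$request_uri;
}

# HTTPS on the www host: serve HSTS, then send to the canonical apex
server {
    listen 443 ssl;
    server_name www.speedrun.cafe;
    # ssl_certificate / ssl_certificate_key for www.speedrun.cafe go here
    add_header Strict-Transport-Security "max-age=15768000" always;
    return 301 https://speedrun.cafe$request_uri;
}
```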

                clarkwinkelmann it's a lot safer to first redirect to HTTPS

                What is the attack vector here, as opposed to directly redirecting to the canonical domain over HTTPS?

                  Sanguine

                  some scenarios:

                  The website's canonical URL is https://example.com/. Let's suppose all HTTPS-enabled routes on example.com and www.example.com answer with an HSTS header with a max-age of 6 months.

                  Using HSTS preload of course makes every single one of these scenarios secure against MITM, provided the user's browser ships a preload list that includes your website.

                  If the user is using a browser that doesn't support HSTS, or the website isn't serving HSTS headers, then scenario 3 will be identical to scenario 2. But support is pretty good: https://caniuse.com/#feat=stricttransportsecurity

                  EDIT: if you're redirecting from www to apex and use the includeSubDomains HSTS setting, I'm actually not sure whether my point still holds. You could redirect directly to the apex over HTTPS and all subdomains would be covered by HSTS. The point still holds if you're redirecting to a completely different domain, or to a domain at a lower DNS level (like apex to www).

                  EDIT2: by the way, MITM does not only mean a full attack to steal user data or impersonate the website. It could also be an SSL downgrade to inject ads or mine cryptocurrency.

                  Update time!

                  Introducing proper subdomain detection and ratings! The scanner will still always try the www subdomain, but it now stays silent if that subdomain doesn't resolve and isn't required. The requirement is based on the Public Suffix List, so it should work on any exotic domain, but let me know if something isn't right.
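                  For what it's worth, the "is www required" decision could look something like this sketch: a host only gets the www check when it sits exactly one label below a public suffix. The tiny suffix set here is a stand-in for the real Public Suffix List:

```python
# Toy stand-in for the Public Suffix List (https://publicsuffix.org/);
# the real list contains thousands of entries, including private ones.
PUBLIC_SUFFIXES = {"com", "org", "co.uk"}

def www_expected(host: str) -> bool:
    """True when host is a registrable domain (exactly one label below a
    public suffix), i.e. the kind of domain where a www variant should exist."""
    labels = host.lower().strip(".").split(".")
    for i in range(1, len(labels)):
        if ".".join(labels[i:]) in PUBLIC_SUFFIXES:
            # registrable iff exactly one label precedes the matched suffix
            return i == 1
    return False

print(www_expected("example.com"))        # True: scanner checks www.example.com
print(www_expected("forum.example.com"))  # False: www not required
print(www_expected("example.co.uk"))      # True
```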

                  The rating works more or less as described in my message above clarkwinkelmann, except it currently ignores extensions completely. It will be improved in the future.

                  Damn, the homepage doesn't look very bright right now with the latest ratings.

                    clarkwinkelmann Thanks for elaborating! You made it very clear.

                    One issue with freeflarum.com is that the service has a wildcard DNS entry (*.freeflarum.com), which makes www.xxx.freeflarum.com resolve, but the wildcard certificate doesn't cover that (as * in SSL certificates doesn't match dots). So FF users now get a warning, which is probably confusing, as nobody uses the www prefix. Any ideas?

                    I added this to my global Nginx config, to redirect http://www.sub.ff to https://sub.ff:

                    server {
                            listen 80;
                            server_name ~^www\.(?<tag>.+?)\.freeflarum\.com$;
                            return 301 https://$tag.freeflarum.com$request_uri;
                    }

                    But I still get a grade B on your scanner, because there is no valid SSL certificate at https://www.test.freeflarum.com. Please advise?

                      Sanguine Oops, I never realized DNS wildcards resolved any number of subdomain levels below.

                      That issue reminds me of StackExchange and their many meta subdomains

                      I have no idea how you could reasonably fix this. I could make my rating a bit more tolerant, but I'm not sure it's worth it. It would still show a warning if somebody typed that URL, and that's not good.

                      Some things that come to mind: generate certificates on the fly for these subdomains (I've heard it's possible), or create certificates for every one of your subdomains and use SNI to serve the correct one, but that's a lot more stuff to monitor...

                      Or maybe you could use a DNS service with pseudo-wildcards that only resolves the third-level domain.