• Support
  • Improving Flarum speed with Cloudflare cache

  • [deleted]

davwheat 30ms is insanely low, and pretty impressive. Anything below 250ms is good in my eyes, with sub-150ms being excellent.

Well, I think I just beat that 🙂

davwheat My GitHub Pages + Cloudflare set-up for my personal site manages 70ms for TTFB, which is pretty good for free hosting if I do say so myself! 😂

Absolutely! That's quite a milestone for shared hosting 🙂

For clarity, I'm not just using Cloudflare to mask a poor setup. I have an almost "OCD"-like approach to correct setups!

    1Dot It's not the truth basically

    That's how it is: I have proof, and I had tens of users complaining. If you didn't experience the issue, good for you, but don't tell me I'm lying... 🙂 I can also show you a reply from support saying it's normal.

    1Dot It's not 100% true, caching plays an ultra major role

    Flarum loads maybe three static files, and only on the first page load; the impact IMO is negligible. That's what I'm saying.

      matteocontrini That's how it is: I have proof, and I had tens of users complaining. If you didn't experience the issue, good for you, but don't tell me I'm lying... 🙂 I can also show you a reply from support saying it's normal.

      OK, so maybe they had an interest in playing it down. As far as I know, Cloudflare's CDN servers are always ready to deliver resources in an instant.

      matteocontrini Flarum loads maybe three static files, and only on the first page load; the impact IMO is negligible. That's what I'm saying.

      Friend, it's great that you're talking about Flarum alone, but why forget about the third-party extensions that add extra JS and CSS to the header? Also, if you deliver these three files from CF's edge network, they will be delivered much faster than from your origin server.

      matteocontrini the impact IMO is negligible

      You can say that, but you're forgetting about shared hosts and low-end servers. Also, what if your user is opening your website from a location that is far away from your server?

        [deleted] Well, I think I just beat that 🙂

        Sorry to ruin your dream, but you may notice that the HTTP status code is 403. This probably means that Cloudflare displayed a captcha and never requested the actual content from your server - these metrics are completely unreliable (unless the goal of your website is to display a Cloudflare captcha 😃).

          • [deleted]

          1Dot It's not 100% true, caching plays an ultra major role

          This isn't true. Flarum is an SPA, meaning that "pages" are created dynamically, so CF has no real concept of them. What DOES make a difference is the delivery of the assets required to load the SPA initially. As these are delivered from CF's edge network, from a host near your location, this improves the initial load dramatically in most cases. Once the SPA is loaded, CF no longer plays a part unless you call another URL from a remote source whose assets the browser has not already loaded.
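
          If you want to check which of those requests actually hit CF's edge cache rather than the origin, a header check is enough; a minimal sketch (the asset path is illustrative - use one your forum really serves):

          curl -sI https://forum.example.com/assets/forum.js | grep -i cf-cache-status
          # cf-cache-status: HIT  -> answered from the edge, origin not contacted
          # cf-cache-status: MISS -> fetched from the origin (and possibly cached for next time)
          # The HTML document itself will normally report DYNAMIC, i.e. not cached at all.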

          1Dot It's not the truth basically. It may have been a service outage that day.

          That actually IS the truth. CF often re-routes traffic as a means of clearing congestion on its own internal infrastructure, and until that traffic leaves their core or edge networks, you have no control whatsoever over it.

          1Dot You can't say that it's useless. If you are under attack you can simply configure the firewall.

          This depends on the nature of the attack. What @matteocontrini refers to is Layer 7, which is the topmost layer in the OSI model (the application layer). Most DDoS attacks are at Layers 1, 2, and 3 (Physical, Data Link, Network). Application layer attacks would require the CF WAF, which isn't free.

          To be completely transparent, you should NOT consider it safe to simply "hide" behind CF, as I've alluded to in recent posts. The USP of CF is its ability to cache at its edge network, but it doesn't operate in the same way as traditional CDNs, in the sense that ALL traffic has to pass through CF before it reaches your own site. If that traffic gets re-routed, then it WILL slow your site down.

          1Dot So by that logic, since everyone is using Google, you should do one thing and remove your Google account today? This is absolutely wrong. If the service is good and free, it's obvious that everyone will use it.

          This isn't even a close analogy. Shared hosting is also "great" until it's oversold, and then your website, which originally ran like a rocket, now runs like a snail. CF won't fix that - there are plenty of providers that hide (i.e. recommend placing) sites behind CF to make their own performance inadequacies far less obvious.

          In short, CF is not the magic wand you may think it is. It certainly helps, and they have superior routing technology (just research Argo), but there's an old saying in the UK which is 100% true:

          "There's no such thing as a free lunch"

          This is a phrase used when a supplier or vendor takes you to lunch for "free". It looks like a nice gesture, but it's designed to win more business, so you'll end up paying for that lunch elsewhere eventually. The same applies to CF. You get three page rules, which for the most part is enough to cache static HTML, disable security for /wp-admin in WordPress, etc., and then perhaps one more to disable Rocket Loader on a particular URL because it screws up the JS.
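
          Purely as an illustration of how those three free rules tend to get used up (the URLs and settings below are hypothetical, not taken from any real configuration):

          1. example.com/blog/*        -> Cache Level: Cache Everything, Edge Cache TTL: a day
          2. example.com/wp-admin*     -> Security Level: High, Cache Level: Bypass
          3. example.com/js/legacy.js* -> Rocket Loader: Off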

          After that, if you need any more, you have to pay. It's a business after all, and they aren't going to provide their entire suite for free - you'll get a "cut down" version of it, meaning that enterprise customers will always take precedence, and your site will end up that much slower because their traffic gets preference in terms of routing.

            • [deleted]

            rob006 It's not a Captcha. It's the initial loading of the site, so no dream being spoiled. I see the session being initiated on my server.

              1Dot Friend, it's great that you're talking about Flarum alone, but why forget about the third-party extensions that add extra JS and CSS to the header?

              Extensions' JS and CSS are bundled into a single file, so that shouldn't be a problem. It would be interesting to find out whether the performance improvement comes from static file caching or from the reverse proxy.

              The fact that you let all traffic pass through Cloudflare can have advantages but also disadvantages. For example, I'm measuring TTFB both through Cloudflare and outside of it, and Cloudflare gives a higher value. That means that for my particular network, and for the connection between edge and origin, it's not worth it after all - and probably the same is true for most users in my country, who are my forum's audience.

              Through Cloudflare:

              Lookup time:		0,001749
              Connect time:		0,051389
              SSL handshake time:	0,153457
              Pre-Transfer time:	0,153590
              Redirect time:		0,000000
              Time to first byte:	0,684040
              
              Total time:		0,802025

              Outside:

              Lookup time:		0,000388
              Connect time:		0,033664
              SSL handshake time:	0,125015
              Pre-Transfer time:	0,125168
              Redirect time:		0,000000
              Time to first byte:	0,608581
              
              Total time:		0,716862
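
              For reference, figures like these can be produced with curl's --write-out option; a minimal sketch (not necessarily the exact command I used - the format file and URL here are placeholders):

              # curl-format.txt maps the labels above to curl's standard timing variables:
              #   Lookup time:        %{time_namelookup}\n
              #   Connect time:       %{time_connect}\n
              #   SSL handshake time: %{time_appconnect}\n
              #   Pre-Transfer time:  %{time_pretransfer}\n
              #   Redirect time:      %{time_redirect}\n
              #   Time to first byte: %{time_starttransfer}\n
              #   Total time:         %{time_total}\n
              curl -s -o /dev/null -w @curl-format.txt https://forum.example.com/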

              Anyway, I don't want to start a discussion 😅 What I'm saying is that adding Cloudflare just to see a very specific ranking improve - one whose methodology and test location we don't even know - is not enough to justify using it just because it's free, IMO. It's something to consider more carefully. And I'm using Cloudflare myself, as you've seen, because I've tried multiple products and ended up there for the moment.

                [deleted] Application layer attacks would require the CF WAF, which isn't free.

                Actually I was referring to L7 DDoS attacks, which unfortunately Cloudflare only mitigates when they have a very large impact. For example, if you get 100 requests per second Cloudflare doesn't mitigate it, although your server would certainly be KO'd because PHP is heavy (at, say, 200 ms of PHP time per uncached request, 100 r/s keeps around 20 PHP workers busy at all times, which is more than most small servers have). If you get 1000 r/s, they probably would.

                My understanding is that this kind of attack is better handled by Bot Management, which is an Enterprise feature unfortunately.

                  • [deleted]

                  matteocontrini My understanding is that this kind of attack is better handled by Bot Management, which is an Enterprise feature unfortunately.

                  Correct, but L7 in the OSI model is in fact the application layer - the topmost tier - so the WAF ruleset would actually apply. However, you can run even a basic ruleset on your own host to mitigate at least some of this traffic. The free version has basic bot detection, but you're right - you'd have to pay to get anything decent.
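
                  For example, a rate limit in front of PHP already takes the edge off this kind of traffic; a minimal sketch, assuming nginx in front of PHP-FPM (the zone name, rate and burst values are made up - tune them for your forum):

                  # http {} block: track clients by IP, allow roughly 5 requests/second each
                  limit_req_zone $binary_remote_addr zone=flarum_php:10m rate=5r/s;

                  # server {} block: apply it to the requests that actually reach PHP
                  location ~ \.php$ {
                      limit_req zone=flarum_php burst=10 nodelay;
                      # ... your existing fastcgi_pass / php-fpm configuration ...
                  }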

                    [deleted] I agree with you 100%

                    [deleted] "There's no such thing as a free lunch"

                    I agree with this 100000000%

                    @[deleted] Also a quick question: since you're clearly a specialist in security, can you please tell me whether it's really safe and whether using it is a good idea or not?

                    matteocontrini Yes, I agree with you this time!
                    That's good to know that you're also using Cloudflare.

                    [deleted] It's not a Captcha. It's the initial loading of the site

                    Your "initial loading" returns 403 status code? That's unusual 😆

                      • [deleted]

                      rob006 It does the same on all sites - even those without CF, it seems 😕 - and it's the same with discuss. I do get your point though - it certainly looks like a captcha request, but I can't see any matching logs at CF for my sites.

                        • [deleted]

                        1Dot Because that's the Windows Live Writer file, which in retrospect is almost harmless as it will attempt to use xmlrpc, which is disabled. Obscurity is not security.

                        • [deleted]

                        Justoverclock Yes, but as @rob006 pointed out, your site resolves to HTTP 200, so for my site at least the test clearly IS being blocked by CF as a bot. What's odd, though, is that I can clearly see the session being built from this IP via CF to my server, so it does get through.

                        • [deleted]

                        1Dot Yes, but note the 403 error code. You can't rely on this site to present real figures unless you whitelist the IP address at CF. On investigation, it's being blocked by the Browser Integrity Check, but the request does make it to my server, which responds.

                        Very odd. 100% unreliable test in this case though.
