tried this too
Cloudflare 1.1.1.1 1.0.0.1
Quad9
Google Public DNS
OpenDNS
and of course AdGuard DNS
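If anyone wants to compare these resolvers for themselves, here's a minimal sketch using the dnspython library. The domain and the resolver list are just examples (the IPs are the providers' public anycast addresses), and real-world results will vary a lot with your location and ISP:

```python
# Rough latency comparison of public DNS resolvers (example only).
# Requires: pip install dnspython
import time
import dns.resolver

RESOLVERS = {
    "Cloudflare": "1.1.1.1",
    "Quad9": "9.9.9.9",
    "Google": "8.8.8.8",
    "OpenDNS": "208.67.222.222",
    "AdGuard": "94.140.14.14",
}

for name, ip in RESOLVERS.items():
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [ip]
    start = time.perf_counter()
    resolver.resolve("example.com", "A")  # placeholder domain; any name works
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{name:<10} {elapsed_ms:.1f} ms")
```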
While using Cloudflare might be good for website speed (as with basically any CDN), it comes with the risk of centralization (when Cloudflare has issues, your website is also down) and a massive privacy issue, as Cloudflare can now track all your users and even perform MitM attacks.
In addition, many users are shut out of Cloudflare-fronted websites, as Cloudflare blocks some (mostly third-world) ISP IP ranges or puts them behind multiple captchas before they can access your website. The same goes for people browsing with Tor.
IMHO, it is better to fix your website and/or look for a better server than to rely on Cloudflare.
Walys I've recently placed my rabbit sanctuary website, which runs WordPress, behind CF, and I'm getting a TTFB of 30ms.
Personally, I don’t use Cloudflare to hide issues. I use it because no matter how I set a website up it makes the site faster. WordPress. Flarum. Whatever CMS. It doesn’t matter. Their tools make the site faster. And faster in more locations.
I just hope Cloudflare sticks around forever and always keeps their free tier.
davwheat Brother, if you use Edge Caching and set the Cache Level to "Cache Everything" for your static pages, you will get an A grade - 90% guaranteed.
Also, if anyone is looking for a tutorial on how to get the Cloudflare boost and security for Flarum CMS, simply see it here.
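For anyone curious what "Cache Everything" looks like outside the dashboard, here's a rough sketch of creating such a page rule through Cloudflare's v4 Page Rules API with Python. The zone ID, API token and URL pattern are placeholders, and the action IDs should be double-checked against Cloudflare's API docs before relying on this:

```python
# Sketch: create a "Cache Everything" page rule via Cloudflare's v4 API.
# ZONE_ID, API_TOKEN and the URL pattern are placeholders.
import requests

ZONE_ID = "your-zone-id"
API_TOKEN = "your-api-token"

rule = {
    "targets": [{
        "target": "url",
        "constraint": {"operator": "matches", "value": "example.com/blog/*"},
    }],
    # "cache_level: cache_everything" tells Cloudflare to cache HTML too,
    # which it skips by default; verify these action IDs against the docs.
    "actions": [
        {"id": "cache_level", "value": "cache_everything"},
        {"id": "edge_cache_ttl", "value": 7200},
    ],
    "status": "active",
}

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/pagerules",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=rule,
)
print(resp.status_code, resp.json())
```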
The truth is that most high-traffic forums use a system like Cloudflare.
It must have something good about it.
poVoq While using Cloudflare might be good for website speed (as with basically any CDN)
And only sometimes. Cloudflare is not a good CDN: it's known for bad routing, and it can slow down your website a lot. I had this problem for months on my forum; Cloudflare routed traffic from Italy to the USA or to Australia for no reason.
Moreover, the L7 DDoS protection is a bit of a joke: it doesn't handle L7 attacks unless you get tens of thousands of requests per second, so it's pretty useless. You have to configure firewall rules by hand and, most of the time, enable them manually when you're under attack (see the sketch of that kind of request-level filtering after this post). Under Attack mode is always bypassed nowadays.
Apart from this, however, I agree that using a CDN is helpful for protecting against L4 attacks and for improving performance when the hosting provider doesn't have good connectivity. For example, if you use some small host, it's likely that some random ISP doesn't have a good route to it (it depends on a lot of things), while if you use OVH/Hetzner/etc. you're unlikely to have these problems. It also depends on where your server is located and where your users are located.
Also, you need to consider that a forum contains primarily dynamic content, and the fact that Flarum is a single-page application makes CDN caching useful only on the first page load. It might improve response times for the reasons above, but caching shouldn't play much of a role.
So be careful about using Cloudflare on the lower plans: it might seem good because it's free, but it's not always the best solution. My recommendation is not to use it just because everyone else is doing it; think about it.
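To make the L7 point above more concrete: firewall rules end up filtering or challenging requests per client before the application sees them. Below is a deliberately simplified, hypothetical sliding-window rate limiter in Python - not Cloudflare's rule engine, just an illustration of why per-client state is needed to stop a low-volume L7 flood that never trips network-level thresholds:

```python
# Hypothetical per-IP sliding-window rate limiter, for illustration only.
# This is not Cloudflare's firewall; it just shows the kind of per-client
# bookkeeping an L7 mitigation has to do.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 20  # arbitrary threshold for the example

_hits: dict[str, deque] = defaultdict(deque)

def allow_request(client_ip: str) -> bool:
    """Return True if this IP is still under its request budget."""
    now = time.monotonic()
    hits = _hits[client_ip]
    # Drop timestamps that have fallen out of the sliding window.
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) >= MAX_REQUESTS:
        return False  # over budget: block or challenge this request
    hits.append(now)
    return True
```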
matteocontrini And only sometimes. Cloudflare is not a good CDN: it's known for bad routing, and it can slow down your website a lot. I had this problem for months on my forum; Cloudflare routed traffic from Italy to the USA or to Australia for no reason.
That's basically not true; it may have been a service outage that day.
matteocontrini it's pretty useless. You have to configure firewall rules
It's not useless. It helps small users like me protect against attacks. You have a point, but you can't say it's useless: if you are under attack you can simply configure firewall rules, there's no rocket science in it.
matteocontrini but caching shouldn't play much of a role.
That's not 100% true; caching plays a major role. Let's take an example: five users come to your website at the same time. If you're not using CF, your forum logo, CSS, and JS are loaded from your origin server for each user, so those five sets of requests hit your server at the same time. Now if five users come to my site, where I'm using Cloudflare's static resource caching, my server just delivers api/discussion to each user, while the CSS, JS, and images for the page are delivered by Cloudflare's edge network, which is super fast.
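One way to check whether the edge cache is actually doing this work is to look at the cf-cache-status response header, which Cloudflare adds to proxied responses. A quick sketch (the asset URL is just a placeholder for one of your own static files):

```python
# Check whether a static asset is served from Cloudflare's edge cache.
# The URL below is a placeholder; point it at one of your own assets.
import requests

r = requests.get("https://forum.example.com/assets/forum.js")
print("status:", r.status_code)
print("cf-cache-status:", r.headers.get("cf-cache-status"))  # HIT, MISS, DYNAMIC, ...
print("age:", r.headers.get("age"))  # seconds the object has sat in cache, if present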
matteocontrini but it's not always the best solution. My recommendation is not to use it just because everyone else is doing it; think about it.
That's like saying: since everyone uses Google, you should go and delete your Google account today - which is absolutely wrong. If a service is good and free, it's obvious that everyone will use it. And if you want to see what caching of static content can do, take a look at the stats here.
Thanks and regards,
@1Dot
Sorry if I said too much!
davwheat 30ms is insanely low, and pretty impressive. Anything below 250ms is good in my eyes, with sub-150ms being excellent.
Well, I think I just beat that
davwheat My GitHub Pages + Cloudflare set-up for my personal site manages 70ms for TTFB, which is pretty good for free hosting if I do say so myself!
Absolutely! That's quite a milestone for shared hosting.
For clarity, I'm not just using Cloudflare to mask a poor setup. I have an almost "OCD"-like approach to correct setups!
1Dot That's basically not true
That's how it is, I have proof and I had tens of users complaining. If you didn't experience the issue, good for you, but don't tell me I'm lying... I can also show you support saying it's normal.
1Dot That's not 100% true; caching plays a major role
Flarum loads maybe three static files and only on the first page load, the impact IMO is negligible, that's what I'm saying.
matteocontrini That's how it is, I have proof and I had tens of users complaining. If you didn't experience the issue, good for you, but don't tell me I'm lying...
I can also show you support saying it's normal.
OK, so maybe support had an interest in dismissing it. As far as I know, Cloudflare's CDN services are always ready to deliver resources instantly.
matteocontrini Flarum loads maybe three static files and only on the first page load, the impact IMO is negligible, that's what I'm saying.
Friend, it's great that you're talking about Flarum alone, but why forget about 3rd-party extensions that add extra JS and CSS to the header? And even those three files will be delivered much faster from the CF edge network than from our origin server.
matteocontrini the impact IMO is negligible
Yes, you can say that, but you're forgetting about shared hosts and low-end servers. Also, what if your user is opening your website from a location that is far away from your server?
[deleted] Well, I think I just beat that
Sorry to ruin your dream, but you may notice that the HTTP status code is 403. This probably means that Cloudflare displayed a captcha and never requested the actual content from your server, so these metrics are completely unreliable (unless the goal of your website is to display a Cloudflare captcha).
1Dot That's not 100% true; caching plays a major role
This isn't true. Flarum is an SPA, meaning that "pages" are created dynamically, so CF has no real concept of them. What DOES make a difference is the delivery of the assets required to load the SPA initially. As these are delivered from CF's edge network by a host near your location, this improves the initial load dramatically in most cases. Once the SPA is loaded, CF no longer plays a part unless you are calling another URL from a remote source whose assets the browser hasn't already loaded.
1Dot That's basically not true; it may have been a service outage that day.
That actually IS the truth. CF often re-routes traffic as a means of clearing congestion on their own internal infrastructure, and until it leaves their core or edge networks, you have no control whatsoever over that.
1Dot you can't say it's useless: if you are under attack you can simply configure firewall rules,
This depends on the nature of the attack. What @matteocontrini refers to is Layer 7, which is the topmost layer in the OSI model (the application layer). Most DDoS attacks are at Layers 1, 2, and 3 (Physical, Data Link, Network). Application layer attacks would require the CF WAF, which isn't free.
To be completely transparent, you should NOT consider it safe to simply "hide" behind CF, as I've alluded to in recent posts. The USP of CF is its ability to cache at its edge network, but it doesn't operate in the same way as traditional CDNs, in the sense that ALL traffic has to pass through CF before it reaches your own site. If that traffic gets re-routed, then it WILL slow your site down.
1Dot That's like saying: since everyone uses Google, you should go and delete your Google account today - which is absolutely wrong. If a service is good and free, it's obvious that everyone will use it
This isn't even a fair comparison. Shared hosting is also "great" until it's oversold, and then your website, which originally ran like a rocket, now runs like a snail. CF won't fix that - there are so many providers that hide sites behind CF (by recommending you place them there) to make their own performance inadequacies less obvious.
In short, CF is not the magic wand you may think it is. It certainly helps, and they have superior routing technology (just research ARGO), but there's an old saying in the UK which is 100% true
"There's no such thing as a free lunch"
This is a phrase used when a supplier or vendor takes you to lunch for "free". It looks like a nice gesture, but it's designed to win more business, so you'll end up paying for that lunch elsewhere eventually. The same applies to CF. You get three page rules, which for the most part will be enough to create static HTML caching rules, disable security for /wp-admin in WordPress, etc., and then perhaps one more to disable Rocket Loader on a particular URL because it screws up the JS.
After that, if you need any more, you have to pay. It's a business after all, and they aren't going to provide their entire suite for free - you'll get a "cut-down" version of it, meaning that enterprise customers will always take precedence, and your site will become much slower because of how they prioritise routing.
rob006 It's not a Captcha. It's the initial loading of the site, so no dream being spoiled. I see the session being initiated on my server.
1Dot Friend, it's great that you're talking about Flarum alone, but why forget about 3rd-party extensions that add extra JS and CSS to the header?
Extensions bundle their JS and CSS into one single bundle, so it shouldn't be a problem. It would be interesting to find out whether the performance improvement comes from static file caching or from the reverse proxy.
The fact that you let all traffic pass through Cloudflare can have advantages but also disadvantages. For example, I'm measuring TTFB both through Cloudflare and directly, and Cloudflare gives a higher value, which means that for my particular network and the connection between edge and origin it's not worth it after all. And probably for most users in my country, which is my forum's audience.
Through Cloudflare:
Lookup time: 0.001749 s
Connect time: 0.051389 s
SSL handshake time: 0.153457 s
Pre-transfer time: 0.153590 s
Redirect time: 0.000000 s
Time to first byte: 0.684040 s
Total time: 0.802025 s
Outside:
Lookup time: 0.000388 s
Connect time: 0.033664 s
SSL handshake time: 0.125015 s
Pre-transfer time: 0.125168 s
Redirect time: 0.000000 s
Time to first byte: 0.608581 s
Total time: 0.716862 s
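For reference, these numbers match curl's timing variables; here's a small sketch of how the same breakdown could be collected with Python's pycurl bindings (the URL is a placeholder). It also prints the status code, since, as noted above, a 403 from a Cloudflare challenge would make the timings meaningless:

```python
# Collect the same timing breakdown as above using pycurl (pip install pycurl).
# The URL is a placeholder; check the status code too, since a 403 challenge
# page would make the numbers meaningless.
from io import BytesIO
import pycurl

def timings(url: str) -> dict:
    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, url)
    c.setopt(pycurl.WRITEDATA, buf)  # discard the body, we only want timings
    c.perform()
    result = {
        "status": c.getinfo(pycurl.RESPONSE_CODE),
        "lookup": c.getinfo(pycurl.NAMELOOKUP_TIME),
        "connect": c.getinfo(pycurl.CONNECT_TIME),
        "ssl_handshake": c.getinfo(pycurl.APPCONNECT_TIME),
        "pretransfer": c.getinfo(pycurl.PRETRANSFER_TIME),
        "redirect": c.getinfo(pycurl.REDIRECT_TIME),
        "ttfb": c.getinfo(pycurl.STARTTRANSFER_TIME),
        "total": c.getinfo(pycurl.TOTAL_TIME),
    }
    c.close()
    return result

for key, value in timings("https://forum.example.com/").items():
    print(f"{key}: {value}")
```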
Anyway, I don't want to start an argument. What I'm saying is that adding Cloudflare just to see a very specific ranking improve - one that we don't even know how it works or where it's tested from - is not enough to justify using it just because it's free, IMO. It's something to consider more carefully. And I'm using Cloudflare myself, as you've seen, because I've tried multiple products and ended up there for the moment.
[deleted] Application layer attacks would require the CF WAF, which isn't free.
Actually, I was referring to L7 DDoS attacks, which unfortunately Cloudflare mitigates only if they have a very large impact. For example, if you get 100 requests per second, Cloudflare doesn't mitigate it, although your server would certainly be knocked out because PHP is heavy. If you get 1000 r/s, they probably would.
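To put a rough number on that, here's a back-of-the-envelope sketch with made-up but plausible figures (both values are assumptions, not measurements):

```python
# Back-of-the-envelope capacity estimate; both numbers are assumptions.
php_workers = 20            # e.g. PHP-FPM pm.max_children on a small VPS
avg_request_seconds = 0.3   # e.g. ~300 ms per uncached dynamic request

capacity_rps = php_workers / avg_request_seconds
print(f"~{capacity_rps:.0f} requests/second")  # ~67 r/s, so a sustained 100 r/s saturates the pool
```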
My understanding is that this kind of attack is better handled by Bot Management, which is an Enterprise feature unfortunately.