After @[deleted]'s tweak, we now have an A+
MigrateToFlarum Lab, the health scanner for Flarum
Here's the new result of the Flarum Lab scan
https://lab.migratetoflarum.com/scans/5d453d52-1c59-4b1e-b77c-dc3421cb2a7a
Following askvortsov's suggestion, I implemented gzip support and reporting in the Lab.
This means the Lab will now accept gzip for all requests, which can potentially speed up the scan for websites hosted on slower connections or on the other side of the globe (all requests are made from a DigitalOcean Droplet in Germany).
The Lab will also show whether gzip is supported in the "Assets size" box and how much bandwidth you are saving.
If the server doesn't support gzip, a message will appear in the same box with recommendations on how to enable it.
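If you want to check this on your own forum before running a scan, here's a rough sketch of the kind of comparison involved, using curl (the URL is a placeholder, use one of your forum's JavaScript assets):

# Sketch only, not the Lab's actual code: compare transferred bytes
# for the same asset without and with gzip.
curl -so /dev/null -w 'plain: %{size_download} bytes\n' \
  'https://example.com/assets/forum.js'
curl -so /dev/null -w 'gzip:  %{size_download} bytes\n' \
  -H 'Accept-Encoding: gzip' \
  'https://example.com/assets/forum.js'

If both numbers are identical, the server is most likely not compressing that file.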
The Lab has been experiencing technical issues these last few days. Not sure why yet, but it's constantly running out of memory. Might be too popular!
I'll try to move it to a new server in the coming days, so don't worry if you see it offline for a while. I'm keeping it online for now but if it gets too unstable I'll switch it to maintenance mode.
Darkle Have you restarted your nginx process to apply the config changes?
askvortsov Yes, I did a typical sudo systemctl restart nginx
Darkle Are you using the nginx configuration provided by Flarum or your own? The Lab only checks the JavaScript files for gzip compression. Your default configuration might only compress images or other resource types, not the JavaScript files.
You can see in the Requests section which files were retrieved for the test. You can try accessing those files in your browser and check in the browser's network tab whether they were compressed (look at the response headers).
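If you prefer the command line, a curl request along these lines shows the same thing (placeholder URL, use one of the files from the Requests section):

# Dump the response headers while discarding the body; a
# "Content-Encoding: gzip" line means the file was served compressed.
curl -s -o /dev/null -D - \
  -H 'Accept-Encoding: gzip' \
  'https://example.com/assets/forum.js' | grep -i 'content-encoding'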
clarkwinkelmann I was using my own nginx.conf to handle multiple sites and redirects on the same server. I checked the one included with Flarum, and its # Gzip compression section had a lot more in it, so I copied it into my nginx.conf and it works perfectly. Thank you.
If anyone else has their own nginx.conf file and is wondering what to include, here it is:
# Gzip compression from /var/www/flarumES/.nginx.conf
gzip on;
gzip_comp_level 5;
gzip_min_length 256;
gzip_proxied any;
gzip_vary on;
gzip_types
    application/atom+xml
    application/javascript
    application/json
    application/ld+json
    application/manifest+json
    application/rss+xml
    application/vnd.geo+json
    application/vnd.ms-fontobject
    application/x-font-ttf
    application/x-web-app-manifest+json
    application/xhtml+xml
    application/xml
    font/opentype
    image/bmp
    image/svg+xml
    image/x-icon
    text/cache-manifest
    text/css
    text/plain
    text/vcard
    text/vnd.rim.location.xloc
    text/vtt
    text/x-component
    text/x-cross-domain-policy;
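After editing, you can test the configuration and apply it with a reload (a full restart isn't required for this change):

# Validate the syntax, then reload nginx so the new gzip settings take effect
sudo nginx -t && sudo systemctl reload nginx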
@clarkwinkelmann
The website has been down since yesterday:
Error 500
Server error
clarkwinkelmann The Lab has been experiencing technical issues these last few days. Not sure why yet, but it's constantly running out of memory. Might be too popular!
I'll try to move it to a new server in the coming days, so don't worry if you see it offline for a while. I'm keeping it online for now but if it gets too unstable I'll switch it to maintenance mode.
My own monitoring shows intermittent availability; unfortunately, I still haven't had time to look into it.
The Lab is back! Now on a more powerful DigitalOcean Droplet. I'll monitor the performance in the next few days and then move the other apps from the old Droplet to the new one, if performance permits.
If you wish to help pay for those upgrades, you can use my DigitalOcean affiliate link or donate through PayPal.
To make some space and improve performance, I have deleted all existing scans. I have also deleted all records for websites that are no longer running Flarum but were still in the database with the flag is_flarum=false.
I have also rebuilt the extensions table from scratch, which should have gotten rid of any extension or release that was deleted from Packagist.
As a consequence of the server move, all thumbnails on https://builtwithflarum.com/ have disappeared, but don't worry: they will be generated again within 2 weeks, or whenever that website is manually scanned again.
All statistics and ignore lists should have been preserved, but let me know if anything disappeared.
It must be how my shared web host has things configured, but your tool gives a 500 error when I try to scan my Flarums (two different ones, same host though). They are definitely up and running. I assume it's the host and/or Cloudflare? I remember using your tool with a Flarum installation I had with a different web host and it worked; I got an A rating. My sites show up in Google search, so Google's bots can reach them. Just this tool can't. Very weird. Very weird things tend to happen to me though, so it's OK.
010101 can you send me the domain name or scan ID? (you can send privately via Discord if necessary, or via the email at the bottom of the lab's page)
I don't have any record of a 500 error since 3 hours ago. In fact I see no failed scans at all in the database so far. Do you mean the lab reports your own forum as having a 500 error? Maybe it's on a host that DigitalOcean somehow blacklisted and I can't reach it (?)
clarkwinkelmann Yes, I mean the lab says my forum is reporting a 500 error.
010101 never seen that before.
It doesn't seem to be related to any kind of IP or geo block. I can access your website fine from any of my IPs.
However, any request made with the Guzzle PHP library to that URL causes a 500 error:
$client = new \GuzzleHttp\Client();
$client->get('https://www.wilcosky.com/');
// GuzzleHttp\Exception\ServerException with message 'Server error: `GET https://www.wilcosky.com/` resulted in a `500 Internal Server Error` response'
Looking at the body of the response, it seems to be an exception thrown by a Cloudflare Worker. The response includes "Worker threw exception" and "If you are the owner of this website: you should login to Cloudflare and check the error logs for www.wilcosky.com."
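For anyone who wants to reproduce this kind of check without PHP, a curl request with a non-browser User-Agent is a rough approximation (hypothetical example; whether it actually trips the Worker depends on what the Worker inspects):

# Print only the HTTP status code, sending a User-Agent similar to what an
# HTTP client library would use instead of a browser string.
curl -s -o /dev/null -w '%{http_code}\n' \
  -H 'User-Agent: GuzzleHttp/7' \
  'https://www.wilcosky.com/'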
clarkwinkelmann Once again Winkelmann I.T. saves the day. That pointed me in the right direction. For anyone else: if your web host connects with Cloudflare, they might do something like add a "Worker" in your Cloudflare settings. I found I really don't need the worker, so I removed it, everything works the same, and I'm Flarum Lab verified with an A rating. Which is all that matters in this world. If you have a Flarum, plus at least an A rating at the Lab, you're somebody.
Side note, I’m SO tired of hCaptcha. I had to go into different Cloudflare accounts multiple times and I’d typo my password so I’d have to solve the hCaptcha. If I have to tap on pictures of trucks or boats ONE MORE TIME, I might pop.