Every now and again I get super concerned about and involved with the webpage loading speed of my other website, the Underground Film Journal, which runs on WordPress, just like this site. Fast page speed is important for two reasons:
One, slow loading times turn off site visitors, who are more apt to click away before a page finishes loading if that loading is taking too long.
Two, search engines factor in page loading speed when ranking a website. The faster the loading time, the better the page rank.
So, while page speed is very important to a website’s health and functionality, there’s usually only so much time to devote to working on it before figuring, “Well, that’s good enough.” Because if I’m working on tech issues, then that’s time away from creating article content. And if there’s no article content, then there’s no site to improve the page speed of. Kind of a catch-22 situation.
Therefore, it’s a relief to find resources that let me fix problems quickly and easily. Over the years, I’ve already made many improvements to the site’s loading speed, from streamlining the HTML and CSS as much as possible, to installing and optimizing the W3 Total Cache plugin, to hosting the site’s images on another server, among other tricks.
Recently, though, I went back to Google’s PageSpeed developer tool, which is an excellent resource for figuring out the whys and hows and wherefores of slow site loading, because no matter how much work on improving a page is done, there’s always more to improve. The problem with Google’s tool, though, is that while it clearly explains what a website’s problems are, it’s terrible at making suggestions for fixes. I guess Google figures that’s my problem.
That’s why I was so excited to find the website GTmetrix, which analyzes a website’s page speed using the same configuration as Google’s PageSpeed tool. What makes GTmetrix so awesome is that it gives very clear, easy-to-follow instructions and code snippets for fixing page speed problems. (The site also runs multiple, simultaneous analyses of webpage speed based on other tools, such as Yahoo’s YSlow.)
Here are two page speed issues I fixed recently:
1. LEVERAGE BROWSER CACHING
A website will want to store a bunch of site information in the visitor’s web browser so that the website will load faster upon repeat visits for that visitor. In other words, if a person visits a website a bunch of times, it’s better to store “information” — such as images — in that person’s web browser for future visits so that those same images don’t have to load from the site’s server every time. This provides a faster loading experience for the visitor and saves the website server from getting hammered so much.
However, at the same time, images shouldn’t be stored in a visitor’s web browser forever, so those images need an “expiration date” attached to them. This is what it means to “leverage browser caching.”
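As an illustration only (this isn’t part of the Journal’s setup), here’s a rough Python sketch of what that “expiration date” actually looks like on the wire. A directive like “access plus 1 month” just means the server attaches an Expires header roughly a month in the future, in the standard HTTP date format:

```python
from datetime import datetime, timedelta, timezone

def expires_header(days=30):
    """Build an HTTP Expires header value roughly a month out
    (the effect of mod_expires' "access plus 1 month")."""
    expiry = datetime.now(timezone.utc) + timedelta(days=days)
    # HTTP dates use this exact format, always expressed in GMT
    return expiry.strftime("%a, %d %b %Y %H:%M:%S GMT")

print("Expires:", expires_header())
```

Until that date passes, a browser that already has the image in its cache can skip asking the server for it again.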
This is actually a pretty basic concept that I thought I was following, but I made a major boneheaded mistake about it. See, I always thought that the W3 Total Cache plugin automatically took care of expiring images on the Underground Film Journal, but what I didn’t think about is that I store most of my images on a different server, so that they don’t fall under the purview of the W3TC plugin. Oy!
So, I needed to manually set the “Expires Headers” for my images on that different server. Very happily, GTmetrix provided the code snippet for the Journal’s .htaccess file that I needed to make that happen. For the Journal, since I’m only caching images and some video, the code snippet looks like this:
## EXPIRES CACHING ##
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType video/mp4 "access plus 1 month"
ExpiresByType video/ogg "access plus 1 month"
ExpiresByType video/webm "access plus 1 month"
ExpiresByType image/x-icon "access plus 1 year"
ExpiresDefault "access plus 2 days"
</IfModule>
## EXPIRES CACHING ##
(Wrapping the directives in an IfModule check keeps the site from breaking if mod_expires isn’t loaded on the server.) I could also probably extend the caching to a year instead of one month, since I rarely change my images, but at least I now have some sort of expiration set.
2. ENABLE GZIP COMPRESSION
Gzip compression is another concept I’ve always been aware of, but in this case, I never knew how to enable it. Again, GTmetrix had a code snippet I could simply copy and paste into the Journal’s .htaccess file.
(By the way, if you’re someone who is inspired by all of this to start working on your own website’s page speed, please make sure you’re fairly comfortable editing an .htaccess file first because messing with that thing can screw up your entire website quickly.)
Gzip compression is just a way to make certain files smaller automatically so that they load faster in a browser. I only have one or two independent JavaScript files that would benefit from gzip and aren’t covered by W3TC, but not compressing them was hurting my page speed in Google’s eyes, so it was best to enable it.
Since I know that my web server runs on Apache, this is the code snippet I used:
# compress JavaScript
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
# work around old browser bugs
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
<IfModule mod_headers.c>
Header append Vary User-Agent
</IfModule>
</IfModule>
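To see why this is worth doing, here’s a quick, purely illustrative Python sketch (not part of the .htaccess setup) showing how much gzip shrinks a repetitive chunk of JavaScript-like text. Apache’s mod_deflate applies the same kind of compression on the fly as it serves each file:

```python
import gzip

# A stand-in for a JavaScript file: code is repetitive text,
# which compresses extremely well
js_source = ("function track(event) { console.log(event); }\n" * 200).encode("utf-8")

compressed = gzip.compress(js_source)

print(f"original:   {len(js_source)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

The browser decompresses the file transparently, so the visitor just gets a much smaller download.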
In the end, those were two quick, easy fixes to implement. I actually have a third fix that I want to write about, but I’m going to do that in another post because it’s a little bit more complicated and I went outside of GTmetrix to find the solution. Stay tuned!