Day 27 - Page Speed Optimization

Ryuzaki · Moderator · Digital Strategist
1header.png


We've been discussing the importance of market research. We want to promote our new asset in front of the right people. We want to target the keywords that the right people in the right frame of mind actually use when they search.

Once we have these "right people" on our websites we'll also go through the trouble of split testing our copywriting, our website design, and even tweaking the steps of our entire sales funnel to make sure we're not leaving money on the table.

But what if the money never hits the table at all because your page took way too long to load?
It doesn't matter if page speed optimization is an advanced topic. It is fundamental to the satisfaction of every user and spider that lands on your site. You can't avoid dealing with it. No matter how amazing your VPS or dedicated server is and how much money you throw at upgrading it, it can't fix the absurdity of a bloated website.

If you de-bloat your site, you can likely get faster speeds on a decent shared server than you would on a VPS, all while reducing your overhead. You should only consider upgrading your server after you've dialed in the basics of page speed optimization.

2panic.png

The 'Don't Panic' Preamble

I know that we aren't all front-end and back-end developers, and that's definitely not expected, least of all of someone who's new to the game. We're going to discuss the overall concept so you understand the main goals, then lay out some easy wins for everyone and show you tools to help you achieve them. We'll put the intermediate and advanced methods in their own section so you can return at any point in the future when you're ready to tackle them, and we'll reference the other resources on the forum listed at the bottom.

The only thing I ask of you is to read today's guide in its entirety. Do as much and understand as much as you can. Consider the rest exposure. As you continue learning, it will all click into place for you eventually. Just don't skip over it. A puzzle can never be complete if you leave the inconvenient pieces in the box.​

What is Page Speed and How Can We Affect It?

Definition:
Page speed refers to the time from the moment a browser requests a page from a server until that page is completely loaded. All the way.

Misguidance Around the Web
You'll hear talk about making the user happy by ensuring the screen is painted fast even if the rest isn't loaded yet. You'll hear about not caring that asynchronous items load slower, as long as the content above the fold has loaded and isn't jumping around the screen while the rest trickles in. Baloney.

We're not going for tricks and cheats. Yes, making the user happy is the primary goal but there are long-lasting and far-reaching secondary benefits to doing it right that will save and make you money. A trick doesn't reduce load and strain on your server. A cheat doesn't trick the brain of a search engine spider (because it doesn't have one... yet). It won't reduce your bandwidth and make your CDN allowance last longer either. No shortcuts for Builders!

3no.png

The True Goal
The overall concept of optimizing your page's loading time is very simple. It can basically be boiled down into these factors:
  • Asking for the least amount of items from the server, while...
  • Making those items as small as possible, while...
  • Telling browsers to remember what they've loaded, so you can...
  • Make the best use of your server resources.
That is the most basic version of the concept and goal. There are a few new tricks of the trade that we'll discuss too, but if you grasp that idea, you can push your optimization into the top 10% of the internet, without question.

Why Do We Care?

"Why should we do all of this? The sites my friends all share on Facebook take 7 seconds to load and crash my browser half of the time, and they don't seem to mind. Aren't they the average internet user?"

Remember our discussion about managing users' mind frames and levels of intent? Facebook AND those crappy sites your friends are sharing don't care about page speed because they don't care about conversions. They want impressions, and they know the clickbait is too strong for users in a zombie state of mind craving a string of quick entertainment fixes.

4instant.jpg

Users Want it, and They Want it Now!
For any mind state other than bottom-barrel entertainment, users do not want to delay their gratification, especially when they know 100 other sites can provide the same information, service, or product.

If someone wants to watch Salem and Netflix happens to be very slow that evening, they're going to browse their happy asses right on over to Hulu. And if Hulu is slow or doesn't have it, they'll check their Amazon Prime subscription they forgot about, because they value the speed and quality. But if Amazon doesn't carry it, they are going to end up at Google typing something like "Salem s1 stream."
Yay right? Your super crappy user experience on your ghetto streaming website with 800 pop-ups got a visitor and they accidentally clicked on a pop-under before deciding to drive down the street to the Redbox.

If Netflix had been running well, that entire journey wouldn't have happened. (This is why ISPs throttle Netflix traffic: they don't want to upgrade their infrastructure. They want to strong-arm Netflix into paying extra fees, which would get pushed off onto us, the users.) That's a real-world example of page speed having a huge effect on the United States with the whole net neutrality fiasco.

If an extremely wealthy person is on your site, zooted out of his gourd, ready to spontaneously buy a $10,000,000 yacht... the last thing you want is for your site to take an extra 600 milliseconds, let alone 6 seconds. If someone wants to know the difference between a princess cut diamond and a marquise diamond and that advertisement on your page will tell them for the low price of $20 per click, do you want them to bounce because you couldn't serve the information fast enough?

Lesson: You may not notice the benefits in the early days of your business' growth, but Amazon calculated that adding one second (one measly second) to their page speed would equal a loss of $1,600,000,000 a year.​

Search Engine Spiders are Judgmental Pricks

5spider.png

Every single metric a search engine can measure in regards to your site, they do. And they use these metrics to determine which site is the most high quality and least likely to be a spammer, SEO, or MFA site in camouflage. Guess what crappy sites who don't care about their users have? Really bad, super slow page load times.

When it comes to ranking for the big volume, high competition terms, this isn't what's going to make or break your place in the SERPs. But guess what happens to giant, established sites who shave an extra 500 milliseconds off of the page loading time sitewide? They notice an immediate boost in long-tail traffic for terms that they and nobody else is optimized for.

With a huge share of searches being completely unique, never typed before and possibly never again, there's no way to do keyword research on them. Yet that's the largest cut of the traffic pie. It's what's left after you take out your meager keyword-researched piece. You win that traffic by being better than everyone else in every way you can, and when it comes to long-tails, a lot of that boils down to page speed.

Lesson: "Someone has to rank for them... might as well be the fast site," says the search engine spider as he ventures across the web.

Before We Get Started - Measure Your Site!
If you don't measure your page speed before starting, you won't know what kind of progress you're making. Browse on over to Pingdom's tool: http://tools.pingdom.com/fpt/

I recommend measuring your homepage from the same location twice and keeping the second reading. By then, Pingdom's browser will have cached the cacheable contents of your site. Measure against that second reading as you continue to test, simply because you can't tell Pingdom (or any other testing service) to wipe its cache. Do the same for a random inner page as well, such as a blog post.

Your results might look something like this:

6slow.png

That's a real world example of a popular website's homepage. Don't worry about the performance grade, as some of it will be out of your control. You want to get the Requests, Load Time, and Page Size as low as possible.

Here's one of my homepages for comparison:

7fast.png

That's more like it. You almost run out of fingers counting the seconds waiting on the first site to load. Mine loads before your finger finishes clicking the mouse button.

You don't need to aim for the same levels of optimization as the second picture above. My suggestion for your first time around is to make sure your site loads in less than 2 seconds. Anything under 1 second and you're doing great. If you happen to hit a lucky moment on your shared server and load in under 1 second, don't skip out on optimization. Run your tests at various times of the day and you'll see a true average of your loading time, and it won't be pretty.


The Fool-Proof Page Speed Wins
There are some methods you can employ on your website that'll increase your page speed that don't require you to know a single lick of HTML, CSS, PHP, or jQuery. Let's talk about those first. What we do need to assume is that you know how to use an FTP client or at least log into your server through your browser and upload some files. You have a website so I hope that's the case.

I'm also assuming that you're using a theme for your website that you purchased or downloaded somewhere. For some of these changes to persist, you will want to create a child theme, so your edits survive as you update the parent theme for security reasons. If you know that this is the theme you'll be sticking with for the long haul, it's wise to set that up so you can make changes. We discussed how to do this during Day 4. If you didn't do it, this would be a good time if you intend on doing any page speed optimization.

Shrink Your Images
There are three main problems with the images that you uploaded or that came with your theme:
  • They have a huge resolution yet are displayed much smaller
  • They are the wrong file types
  • They aren't compressed to drop out unused colors
Theme designers are often just that. They know what looks pretty and know how to get it there and that's all they care about. They don't know about this stuff any more than the average webmaster does, and that's why you're about to gain a competitive advantage.

8tiny.jpg

• Fixing Resolutions
Look at some of the images being used on the site. It might be different depending on your operating system, but right click on one and copy the image location. Paste it into the browser. Does it appear as the same size as it does on your website? Or is it five times as large?

There's zero reason to take an image that is 4000 x 3000 pixels and then display it on your site as 400 x 300 pixels. That's ten times the size in each dimension, meaning you're forcing the user to load roughly a hundred times as much image data as they need! Look for and fix every instance of this problem. Some Wordpress themes will tell Wordpress to create various thumbnail sizes for you so that this problem never occurs. Maybe you got lucky. If you didn't, go through and resize these images, compress them, and re-upload them at the exact same location on your server, or re-upload and re-associate them in your media organizer and visual text editor.
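A 4000 x 3000 image has roughly a hundred times the pixels of a 400 x 300 one (ten times in each dimension). Here's a quick back-of-the-envelope sketch in Python; it counts raw uncompressed pixel data, so real JPG/PNG file sizes will differ, but the ratio is the point:

```python
# Back-of-the-envelope only: raw RGB pixel data at 3 bytes per pixel.
# Compressed formats (JPG/PNG) will be smaller, but scale the same way.
def raw_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

uploaded  = raw_bytes(4000, 3000)  # what you uploaded
displayed = raw_bytes(400, 300)    # what the page actually shows
print(uploaded // displayed)       # 100 -- a hundred times the data
```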

• File Types
There are three main image file types for the web these days: JPG, PNG, and GIF. The differences relate to how well they print onto paper or show up on the screen. Don't worry about paper. With that said, and with the recent advancements in compression algorithms, JPG and PNG can be considered equal. I prefer to use PNGs because I usually include text and watermarks, and they seem to hold up sharper with less jitter. Both display sharp enough on the screen.

GIF, in comparison, is a very lossy file format. It was designed to require the least amount of storage possible by reducing images down to a smaller set of colors. When images are small in resolution, such as icons, or are purposefully blurry such as background images, you can take advantage of converting them into GIFs to save storage and bandwidth.

Go through your theme itself and look at the images included in its design. Are there background textures, small icons (25 x 25 and smaller), or other visually unimportant images that can be converted to GIFs? The savings will be drastic and sitewide. Every page on your site will benefit from these changes. This is also why you need to create a child theme, so that you won't overwrite them if you update your theme in the future.

• Image Compression
With zero exaggeration, this is by far the easiest place to speed up your site and one of the most impactful. If you do nothing but this, you'll notice a strong difference.

You know the two page speed images above? Here's a screenshot from when I optimized them using www.TinyPng.com:

9png.png

TinyPng will compress PNGs and JPGs! Another option is Kraken, but I prefer TinyPng's algorithm. Notice that both images were compressed by an average of 63.5%. Imagine doing that to every single image on your website. You can likely cut your loading time by a third if not a half, depending on the number of images you use, just by compressing them.

Once you resize all of your images and convert the small ones to GIFs, you can mass-compress them together. Keep the file structure the same and zip the folder when you're done. Re-upload it and unzip it, and you'll replace all of your images with your new, smaller versions. If you're unsure of what you're doing, take a backup of your site first.
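If you'd rather script the zip-it-back-up step than trust a GUI, here's a minimal Python sketch (the folder and file names are made up for the demo). It preserves the internal structure, so unzipping on the server drops every image back exactly where it came from:

```python
import tempfile, zipfile
from pathlib import Path

def zip_folder(folder, archive):
    """Zip a folder, preserving its internal structure, so it can be
    unzipped in place on the server to replace the originals."""
    folder = Path(folder)
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in folder.rglob("*"):
            if path.is_file():
                # Arcname is relative to the folder's parent, so the zip
                # recreates e.g. images/icons/search.png, not just search.png.
                zf.write(path, path.relative_to(folder.parent))

# Tiny demo tree standing in for your compressed images folder:
root = Path(tempfile.mkdtemp())
(root / "images" / "icons").mkdir(parents=True)
(root / "images" / "icons" / "search.png").write_bytes(b"\x89PNG...")
zip_folder(root / "images", root / "images.zip")
names = zipfile.ZipFile(root / "images.zip").namelist()
print(names)  # ['images/icons/search.png']
```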


The above methods will reduce the amount of data that your server has to send to the user's browser.

Every single request to your own server is linear with the current structure of the internet, unless you dig into some advanced methods later. The principle holds true even then for the most part. You can only serve one item at a time and until it finishes, the next one can't start. That's why making them as small as possible is a big deal. The faster it flies through the pipes, the sooner the next piece can start.


Shed, Combine, & Minify
Now let me show you how to make some other items smaller, get rid of some requests by combining items, and get rid of others altogether.

10shed.png

Plugins
We've been assuming that most people are using a CMS of some sort and that means there's a plethora of plugins available to make your life easier. Even as a developer, I still use my preferred CMS and three plugins that I trust will continue to be updated (because the creators make so much money off of it). I build my own themes, but some of those plugins are far too tasty to avoid.

It's even worse for a newcomer or someone who doesn't understand the damage these plugins can do. They are fun to browse and imagine how easy it would be to add new features to your site. Before you know it, you're up to 700+ requests like the site above and loading almost 5 MB of data. And you're probably not even using the plugins.

With extreme prejudice, go look at your list of plugins and determine which:
  • I'm not using but it's activated
  • I'm not using and it's deactivated
  • I could live without
  • I can find a better way of doing its job
If the plugin falls under any of those four categories, then it's not crucial to your website. Deactivate it and delete it. No mercy, no remorse. Now go re-measure your homepage and inner page. I can guarantee you dropped requests and it's faster.

That's because plugins have to exist in isolation as self-packaged monsters that eat time. They are like Stephen King's Langoliers. In order to be generalized enough to slide into anyone's website and work their magic, they have to deploy all manner of images, CSS sheets, JS files, and even get into and bloat your database. Get rid of 'em!

Theme Options
Remember when you chose that theme to use because it looked awesome? It had that giant theme options panel where you could tweak every setting without touching the code? You dun goof'd.

Ask yourself again, "is it critical?" as you go through the options. Does the theme let you use as many Google Fonts as you'd like? Use no more than two, preferably one. Does it let you activate carousels, interesting animations, and other things that move around? Kill them off. That's extra jQuery you don't need. Keep going through the list and see what you can disable.

After you do, re-test your site at Pingdom again. Did you drop more requests and page size? If you didn't, then your theme is "calling" these files and options even if you're not using them. I'd change themes on principle alone, but if you're attached to it, ask a developer friend to dig into the PHP files and get rid of those requests. Or figure it out yourself because...

Combine and Minify CSS & JS
Once you're done trimming the fat, you can take the remaining CSS files and combine them all into one file instead of several. You can do the same with the JS files (JavaScript and jQuery). But make sure that all calls to those files get switched over to the new file, or you'll have broken aspects of your site.

Once you've done that and confirmed everything is working fine, open up the files and browse on over to CSSMinifier and Javascript-Minifier.

Here's what's going to happen. You're going to paste your CSS text into the left side of CSSMinifier, and it's going to remove every bit of whitespace and most of the comments. All this does is shrink your file size by a decent amount. You're going to do the same with your JavaScript, and it'll shorten variable names, cut out whitespace, and shrink your file size.

Here's an example of CSS minification:

11min.png


That dropped the file size from 21kb to 16kb. The more CSS you have, the greater the savings. The bigger benefit is that you're now loading one CSS file instead of four.
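To demystify what those minifier sites are doing, here's a naive combine-plus-minify sketch in Python. It's illustrative only; real minifiers handle edge cases (strings, calc(), selectors with spaces) that this regex approach ignores:

```python
import re

def combine_and_minify(css_sources):
    """Concatenate stylesheets in load order, strip comments, collapse whitespace."""
    combined = "\n".join(css_sources)
    combined = re.sub(r"/\*.*?\*/", "", combined, flags=re.S)  # drop /* comments */
    combined = re.sub(r"\s+", " ", combined)                   # collapse whitespace
    combined = re.sub(r"\s*([{};:,])\s*", r"\1", combined)     # trim around punctuation
    return combined.strip()

css_files = ["body {\n  margin: 0;  /* reset */\n}", "h1 { color: #333; }"]
print(combine_and_minify(css_files))  # body{margin:0;}h1{color:#333;}
```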

You see the pattern here? All we want is the fewest files possible, each as small as possible, that still do the job we need. That's the majority of page speed optimization.

Sometimes you can't combine external resources from plugins with your main resources without the skill set and time to deal with it, and doing so reduces a lot of flexibility for the future. If you're on Wordpress, check out the BWP Minify plugin. It will take all of these files, maintain their loading order, and then minify them for you. That includes CSS & JS. As I warned before, don't accept this solution if you can do it otherwise. You're adding operations to remove operations when you could have a higher net gain doing it manually.


Seal the Deal with Caching
You've come a long way. If you were thorough and ruthless in combining the good and shedding the nonsense, your site has made some serious progress. Every page is now loading faster thanks to these sitewide decisions you've made. Now we're going to replace your old two-stroke fossil fuel with plasma ions.

ROM vs. RAM (or as the kids call it these days, storage and memory). Caching tells your server and your user's browser to quit asking for the same stuff over and over again. It says, "Hey, I've sent you that style sheet already. Pull it out of your memory, dummy," and "Look, I already trudged through the database to find all the comments for that page from this date to that date. I'm not doing it again. Use your short-term memory."

Browser Caching
Every website can have, or already has, an invisible file called .htaccess, where the period at the front tells your file system to hide it from you so you don't screw it up. You need to become familiar with this file as a webmaster, because it controls a lot of the behavior of your site and of your users' browsers in regards to your site.

You'll need to tell your FTP client or cPanel or whatever file system you're using to allow you to see hidden files. For cPanel, view the two steps below to enable dotfiles to be viewed in your file manager.

12file.png

Edit the .htaccess file found at the root level of your public_html, your Wordpress install folder, or wherever it exists based on how you installed your site on Day 4.

Paste the following chunk of code into it at the top or bottom without changing other information there:

Code:
## EXPIRES CACHING ##
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType text/html "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType text/x-javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 1 month"
</IfModule>
## EXPIRES CACHING ##

This not only tells the browser to cache the file types listed, but tells it when to expire them as well. That means that as long as the visitor continues to visit your site, whether that's viewing multiple pages in one session or coming back daily to read your blog, the browser will hold a copy of the files and load them from memory instead of requesting them from your server again.

Beware: if you change a file, the new version won't load for returning visitors while the old one is still cached. You need to change the file name or tag on a dynamic variable such as... image.png?v2, where ?v2 is the dynamic tag. Double beware: many caches and proxies won't cache files with dynamic variables at all! What a cruel world.
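One common workaround for the cache-versus-updates problem is to bake a short hash of the file's contents into the file name itself. No query string, so caches stay happy, and the name only changes when the contents actually change. A hypothetical sketch in Python (the path is invented for the example):

```python
import hashlib

def hashed_name(path, content):
    """Build a cache-busting file name like /css/style.9dd4e461.css.
    The name changes only when the file's contents change."""
    stem, _, ext = path.rpartition(".")
    digest = hashlib.md5(content).hexdigest()[:8]  # short content fingerprint
    return f"{stem}.{digest}.{ext}"

print(hashed_name("/css/style.css", b"body{margin:0}"))
```

You'd then update the references in your HTML to point at the new name, and the year-long Expires header above becomes completely safe.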

Server-Side Caching
If you're not using a CMS, don't bother with this unless you have a managed server and can get the host's support to do it for you. If you're using a CMS such as Wordpress, Drupal, or Joomla, there's going to be a server-side caching plugin available. It will have default settings that I don't suggest you change unless you know what you're doing.

What this does is cache static versions of all of your dynamic pages, which your server can send to the browser instead of re-pulling all of the information from the database every time. It will go as far as to store individual objects such as dates, comments, blog posts, meta data, and more. The less time your server spends running through the database to compile pages, the faster your user can start viewing them.
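The core idea behind every server-side caching plugin boils down to something like this little Python sketch: do the expensive work once, keep the result in memory, and serve that copy until it expires:

```python
import time

class TTLCache:
    """Minimal server-side caching idea: store built pages in memory and
    serve the stored copy until it expires."""
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, time it was built)

    def get_or_build(self, key, build):
        entry = self.store.get(key)
        if entry and time.monotonic() - entry[1] < self.ttl:
            return entry[0]                        # hit: skip the expensive work
        value = build()                            # miss: hit the "database"
        self.store[key] = (value, time.monotonic())
        return value

db_queries = 0
def render_page():
    global db_queries
    db_queries += 1            # pretend this is a pile of database queries
    return "<html>...</html>"

cache = TTLCache(ttl_seconds=60)
cache.get_or_build("/blog/post-1", render_page)
cache.get_or_build("/blog/post-1", render_page)
print(db_queries)  # 1 -- the second request never touched the "database"
```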

13thanks.jpg


That's it! As a beginner with minimal knowledge of scripting and display languages, server set-ups, and other nuances of the game, you still managed to optimize your site better than everyone who never knew to try and most of those who did try! The adventure doesn't stop there by any means. Return to this guide in the future when you're ready to really push the limit and investigate the intermediate and advanced suggestions below.​

One more word before we move into the more complicated methods...

When Do I Need to Upgrade My Server?
Earlier, I was discouraging you from running off and using money as a method to increase your page speed, because that's not how it works at the start. Server resources can only help a bloated website to a degree. However, once you've done all of the beginner methods above and you're not happy, then you might consider an upgrade.

For instance, if you cut the number of requests drastically along with the total page size and you're still loading slower than molasses, then you're in a position to determine if it's a server problem.

You're more than likely on a shared server right now, which is smart. You don't need an entire server dedicated to your site, or even portions of one set aside for your site only, unless you're getting a crazy amount of traffic and noticing it bogging down. The problem with shared servers, though, is that unless you're monitoring them all of the time, you don't know if some other successful website is eating up all of the resources just when you need them the most.

First, determine if your TTFB sucks. That's "Time To First Byte." You can check it at ByteCheck.com.

For instance, BuilderSociety.com just registered a 0.139s TTFB for me. My own site from the page speed image above showed a 0.105s TTFB. The horribly optimized site above showed a 0.362s TTFB.
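If you're curious what TTFB actually measures, here's a self-contained Python sketch. It spins up a throwaway local web server and times the gap between sending a request and receiving the first response byte. (Real-world numbers are dominated by network distance and server load, which a localhost test obviously skips.)

```python
import http.server, socket, threading, time

# Throwaway local server so the demo is self-contained.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def measure_ttfb(host, port, path="/"):
    """Seconds from sending the request until the first response byte arrives."""
    with socket.create_connection((host, port), timeout=5) as sock:
        start = time.perf_counter()
        request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode())
        sock.recv(1)  # blocks until the server's first byte shows up
        return time.perf_counter() - start

ttfb = measure_ttfb("127.0.0.1", port)
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```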

A high TTFB is a solid indication that the site isn't getting the resources it needs, whether because of traffic demands, poor optimization, or a crappy shared server with 1,000 other sites. Either way, the server isn't getting the job done like it should.

To put it in perspective, the crappy optimized site's server didn't even send a single byte before my entire homepage had loaded.
You can also log into cPanel or WHM or whatever management portal you have and look at the server resources. You'll see RAM usage as either a percentage or a number from 0.00 to 1.00, where lower is better. That's how much of your available RAM is being used. You can look at the same strain on the CPU.

If you've done all you reasonably can with speed optimization and you're still getting high resource usage with a high TTFB, then I'd consider upgrading to a VPS. BuilderSociety uses Knownhost and I've been using them for my personal sites before I joined this forum. (Use the coupon code BUSO15 on a VPS-2 or higher, or BUSOSSD for an SSD-2 or higher and get 15% off for the lifetime of your rental.)

14fast.png

Intermediate & Advanced Methods
There's a lot more you can do to make sure your users and the search engine spiders have the fastest, most well-greased experience on your site. But it requires a little more knowledge about front-end development and other technical skills. If you're not ready, don't open the can of worms.

But if you know how to navigate the ocean of terminology, coding languages, registrars, nameservers, and the rest, then you're in luck. I'm going to introduce you to some more topics briefly and then point you to other resources on the forum where they're explored in much greater depth. Apply them all and your site can be loading in half a second or less.

Keep Reducing HTTP Requests
You've probably hacked away at unnecessary features on your site like a neckbeard on a jungle exploration seeking the Pyramid of Speed Loading. But now you can't get rid of anything else. The rest is non-negotiable. That doesn't mean you can't still trim the number of requests while keeping these items!

Do It With CSS!
The axiom is "If it can be done with CSS, do it." Don't go overboard and become one of those guys drawing every image with CSS. Look for simple sitewide wins. For instance, did your theme designer use 10 different icon images to display all of the social networking logos? Forget loading 10 images or even just one. Do it with CSS! There are fonts out there where each letter is a social networking logo. Install that, style it, color it, use it. You've trimmed 10 image requests into 1 font request and saved at least 50kb of bandwidth.

The same goes for gradients, shadows, lines, and other small graphics. Don't load a 0.5 kb vertical slice of a gradient and then tile it horizontally with repeat-x... just draw it with CSS. There's an entire school of web designers who start in Photoshop and slice the final product into HTML and CSS. They end up using a ton of images where CSS itself would have sufficed. When all you have is a hammer, everything looks like a nail... but your toolbox is bigger. Focus on CSS and get rid of those images.
Okay, so you converted all possible images into CSS and reduced your load size and number of requests some more. Feels good. But there's still a ton of pictures that weren't worth converting into CSS, but load on every single page. Let's fix that.

CSS Sprites
Imagine a scenario where your header contains your logo, two security seals, and a magnifying glass icon, while your footer holds a smaller version of your logo and a picture of your face. That's six requests that can become one, all through the magic of CSS Sprites.

A sprite is a handful of smaller images all loaded together on one larger image, like you see here:

15sprite.png


You might save a little bandwidth or lose a little, but you'll replace five images with one, as in this example. Once the connection between server and browser is open, it's faster to keep it open and load a little more data than to keep opening and closing connections. If you can find sitewide images like this to combine, you can get massive savings on every single page.

At this point, you use CSS to load the images off of the one picture by defining each image's size and starting location using Cartesian coordinates. It sounds confusing but it's not. @turbin3 recommended this tool, which I found useful myself. It will generate the coordinates for you as you drag and drop your images around.
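If you'd rather see the mechanics than trust a generator, here's a small Python sketch that emits the CSS for one image in a sprite sheet. The class name, offsets, and sprite.png file name are all made up for the example. Note the offsets are negative on purpose: you're sliding the big sheet up and left behind a small window.

```python
def sprite_css(selector, x, y, width, height, sprite="sprite.png"):
    """CSS for one image in a sprite sheet: the element is a width x height
    window, and the sheet is shifted so the image at (x, y) lands in view."""
    return (f"{selector} {{ width: {width}px; height: {height}px; "
            f"background: url({sprite}) no-repeat; "
            f"background-position: {-x}px {-y}px; }}")

# A 25x25 search icon that sits 50px from the sheet's left edge:
print(sprite_css(".icon-search", 50, 0, 25, 25))
```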
De-Bloat Your Database
If you're using a database, then it's probably bloated unless you've been maintaining its cleanliness. It's probably chock full of post revisions, spam comments, taxonomies you aren't using, stale user accounts, and more.

Danger: Never mess with your database unless you've taken a full backup of your site first.
I wouldn't attempt to do this by hand, personally. If you're using a database, chances are 99.9% that you're also using a CMS. If you're using a CMS it's likely a popular one that has a plugin that will do this for you. Still, backup your site first and then make sure it's a plugin that's trusted and does the job correctly. You don't want surprises later on after you've over-written the last good backup.

For Wordpress, I trust and use WP-Optimize (and it's free).
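To see what such a plugin is doing under the hood, here's a toy Python/SQLite sketch of the "delete old post revisions" step. The table layout is a stand-in, not Wordpress's real schema, and as the warning above says: never run deletes on a live database without a backup.

```python
import sqlite3

# Toy stand-in for a CMS database -- NOT Wordpress's real schema.
# Every saved draft piles up as a 'revision' row and never leaves on its own.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, post_type TEXT)")
db.executemany("INSERT INTO posts (post_type) VALUES (?)",
               [("post",), ("revision",), ("revision",), ("revision",)])

deleted = db.execute("DELETE FROM posts WHERE post_type = 'revision'").rowcount
db.commit()
print(deleted)  # 3 -- the revisions are gone, the published post remains
```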

Prioritize the Above-The-Fold Rendering First
Remember how I said doing only this was a trick and a cheat? Well, once you've done the other stuff, there's nothing wrong with changing your user's perception of how fast the page loads! If above-the-fold is fully painted and the rest is loading while they are getting their bearings, then all the more power to you.

To pull this off, do the following three things:
  • Set up a fallback system font that can pre-load before your webfonts
  • Inline the CSS of sitewide above-the-fold elements into the HTML of your header
  • Move render-blocking javascript to the footer
That's it. With a system font for OSX, Linux, Windows, etc., the font can pop on screen immediately without waiting on Google Fonts or Adobe TypeKit, etc. With the CSS for sitewide elements inlined into the HTML, they can load before the CSS stylesheet is loaded. And finally, if your JS files are in the footer, everything else can happen before they are encountered linearly by the browser. That means the server will send them last.
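The "inline the critical CSS" step is mechanically simple. Here's a hedged Python sketch (the HTML and CSS strings are invented for the demo) showing the idea: jam a style block into the head so the first paint doesn't wait on a stylesheet request:

```python
def inline_critical_css(html, critical_css):
    """Inject above-the-fold CSS directly into <head> so the first paint
    doesn't wait on an external stylesheet request."""
    return html.replace("</head>", f"<style>{critical_css}</style></head>", 1)

page = "<html><head><title>Shop</title></head><body>...</body></html>"
print(inline_critical_css(page, "header{height:80px}"))
```

In practice you'd do this in your theme's header template rather than post-processing the HTML, but the output is the same.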

Use a Content Delivery Network
CDNs serve one main purpose. They load up stuff like your CSS file, JS files, and all of your images, and distribute them to their servers around the world. This reduces strain on your main server and shortens the path your data travels to customers.

16cdn.png


Imagine that your server is physically located in New York and you don't use a CDN. Even though this data travels at the speed of light, it's not a straight shot anywhere. It has to jump through different nodes and be re-routed and all kinds of technical details I don't know about. Without a CDN, the guy in Texas is going to experience a faster page load on your site than the guy in California.

However, your CDN supplier might have a server in New York, Texas, California, England, China, Japan, Australia, and South Africa. And when someone from Russia lands on your site, they aren't going to wait for your data to crawl all the way across the ocean from New York. They're going to get it from the server in China, incredibly fast.

That's what a CDN does. Of course, providers also try to offer you more features like DDoS protection and other things you don't need to worry about yet. What you want is that distribution of servers around the locations your users come from.


Conclusion

Sure, this is getting a bit more technical, but with the tips above even the uninitiated can cut their loading time significantly and produce a direct positive impact on their revenue.

The stakes get higher and higher as your traffic volume and value grow. You'd be a fool not to at least reap the benefits of the easy optimizations once your site is earning.

Watch out though! This is an addictive game. Get it good enough. Much like redesigning your site over and over, it can become a trap. Good enough is good enough. At some point, it will be out of your control as display ads and other elements force extra resources to load. Everything's a trade-off, and the conversation will always come back down to one word...

Optimize!


Additional Day 27 Study Materials:

 
I just discovered while trying to do this that Kraken will maintain your folder structure. So you can zip and download your entire wp-content folder, run it through Kraken, grab the result, re-zip it, and re-upload it. Now you've got a backup too in case your upload messes up.

I'm torn, because I do see that TinyPNG is shaving off more of the file size without any visible difference. I didn't want to drag and drop 25 images at a time in there, but I eventually found their Wordpress plugin, which lets me run through the Media section of the dashboard and click to compress the old images (and it automatically compresses the new ones!). The developer API key is free but there's a 500-image limit per month. If you want to bulk compress beyond that, you'll have to pay $0.009 per image until you hit 10,000, where it drops to $0.002. So you could bulk run 5,000 images for $40.50. Totally worth it to just crunch your whole site, especially if your resolutions are already correct.

On a big site, I can imagine this change alone could alter your life money wise with the extra traffic you'll bring in...

I'm trying to find the easiest, quickest wins from your list to do first (I can't do all the coding stuff yet).

Seems like just the following could make a big difference for anyone's sites:
  1. Compress pictures
  2. Set up Browser caching
  3. Set up a Server caching plugin
  4. Use a CDN
Those are all almost entirely plugin-based, easy to do, and can all be done for free.
 
Totally worth it to just crunch your whole site, especially if your resolutions are already correct.

The only potential issue I see with this is for anyone using pre-built themes that generate as few as 3 or as many as 10 different image sizes per image you upload. Wordpress literally creates that many versions of your file, even if they aren't used within your theme. This is something theme developers set up, often sloppily with no regard for page speed.

Just be aware that you may be paying to compress 100,000 images when you only needed 20,000. It's still going to be worth the money to not spend the time trying to sort it out. But you'll continue to pay for it as you hit your 500 images a month limit with all the unused duplicates. The plugin will compress all versions, used or not.

Correction: @Vert pointed out that you can browse in the Dashboard to Settings > Media >
PNG and JPEG compression... and use the checkboxes to choose which sizes you want to automatically compress or not.

I'm trying to find the easiest, quickest wins from your list to do first

These are definitely the quickest and easiest wins, but remember the two core principles of page speed optimization:

1) Cut down how much data you're sending to the browser from the server (what you're doing)
2) Cut down the number of sequential connections (HTTP Requests) you're making by combining data into fewer files or getting rid of them entirely.

The second point offers just as many benefits if not more because it helps you achieve the first point as you go. But yes, it's a little more technical.

I agree with your list, and that's the order that I'd go after it as well, if my skill set was limited. I'd also use this minifying plugin I mentioned. It's a crutch, but if you need one, you need one. It gets the job done okay. Make sure it plays with your server-side caching properly though.
 
What would you recommend doing about admin-ajax.php on WordPress? I swear I've tried everything and it continues to kill my load times.
 
@RomesFall, I've never encountered a problem with this because I shy away from Ajax on the front-end. It sounds like a specific plugin you're using is abusing the Heartbeat API, which shoots data back to the browser so it can load data into specific sections through jQuery without refreshing the entire page.

You can use the Query Monitor plugin to determine exactly what the culprit is, or use GTMetrix to look at the POST headers to see what's hitting it, like here:

aMedkT0.png


In this case a guy had a plugin called WPTouch that was screwy from the developer's side.

I'd identify the problem and get rid of the plugin or whatever code you added, but if you must keep it, you can control the frequency at which the Heartbeat API can be pinged. It can be done through a plugin such as Heartbeat Control or you can write a manual function to do it.

Constant pinging of this API can eat up CPU resources and continually hold data in RAM and possibly attempt to refresh it on each ping. It's also additional requests and waiting on the server to load things that aren't even being displayed the first time around.
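If you'd rather write the manual function than install another plugin, a minimal sketch for your theme's functions.php might look like this. It relies on WordPress's `heartbeat_settings` filter, and the 60-second interval is just an example value:

```php
// Slow the Heartbeat API down to one ping per minute (example value).
// Goes in your theme's functions.php; WordPress only.
add_filter( 'heartbeat_settings', function ( $settings ) {
    $settings['interval'] = 60; // seconds between Heartbeat pings
    return $settings;
} );
```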
 
Unfortunately, it would seem a lot of themes and plugins are now starting to utilize admin-ajax.php to an increasing degree. They usually utilize them to perform real time functions, to parse and save or return things in real time, for data logging, and a number of other things. For example, Thrive Themes' plugin, Thrive Leads, offers a lot of lead gen features, including impression tracking of users' activity around and with your lead gen forms. It accomplishes this through using admin-ajax.php to monitor those lead gen forms and users' activities with them, then saving that data on the backend, so you can see the insights in your dashboard.

The Heartbeat Control plugin helped a little bit. I usually set it to 60 seconds, which is the current maximum. That being said, admin-ajax.php is unfortunately an example of the increasing bloat present in Wordpress by default.
 
Checking Caching of Webpage Elements

When doing page speed optimization, at times you have to set up caching settings through .htaccess (Apache) or your site's conf file (NGINX) in order to leverage browser caching correctly.

What is caching? Basically, if a user visits your homepage, their browser downloads files like your logo once. As they visit multiple other pages, the browser doesn't need to keep asking your server for the same logo file over and over, which makes the experience faster.
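For Apache users, a minimal browser-caching sketch for .htaccess using mod_expires might look like this (adjust the file types and lifetimes to your own site):

```apacheconf
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change, so let browsers keep them for a month
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  # CSS and JS change more often, so use a shorter lifetime
  ExpiresByType text/css               "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```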

I found this tool to help you see your webpage's cached elements, what type of caching is used, and most importantly the length the cache for each element is set to: Cache Checker: https://www.giftofspeed.com/cache-checker/

I ran it on one of my web pages I'm developing and it spit out this detailed info:

R54aP08.jpg


^^ Elements which are not cached are separated out for you so you can deal with them and then recheck with the tool. It's pretty neat.
 
@Ryuzaki First off, thanks for the great guide. I've had some great success getting my pages to load faster. While working on this, I ran some tests on Google PageSpeed Insights, and one of the suggestions was to enable Gzip compression.
After some searching I found some code to add to .htaccess. After running a test on Google as well as Pingdom, I saw a huge gain in speed and reduction in file size. The WP theme is really bloated. I used a couple of different browsers and did not notice anything funky going on.

My question: Is there a reason to use or not use Gzip when working with a stock WP theme? If you recommend using it, are there any known issues I should look out for?

I would prefer to use a theme which was written well, but in this instance I am handcuffed to my client's needs/wants.
 
I didn't specifically mention Gzip, because in my mind it was implied in the portion on server caching. I just re-read it and it's definitely not obvious to the reader, so thanks for bringing it up.

When it comes to Wordpress server-side caching, I recommend WP Super Cache, a plugin developed and maintained by the Automattic team themselves (official Wordpress stuff). If you use that, it will automatically apply the proper .htaccess code to enable Gzipping only in relation to the cached files.

If you aren't using Wordpress, you can find simple copy and paste material for Apache and Nginx servers online that will Gzip all of the appropriate file types.

Gzip Explained
For those who aren't aware, Gzip is exactly what it sounds like. It's a compression method to reduce the amount of data being sent from the server to the user's browser. This saves you on bandwidth costs and increases your page speed. Every modern browser is equipped to decompress Gzip content.

The browser sends an HTTP Header saying that it can Accept-Encoding: gzip and if your server is set up and has it ready, it will respond with Content-Encoding: gzip.

The basic conceptual idea is that it will take redundant expressions in HTML for example, and reduce them to variables. So something like:
HTML:
<div>Once upon a time...</div>
<div>Once I had gone to the store on time...</div>
Might be reduced to:
HTML:
1xt upon a 9cR
1xt I had gone to the store on 9cR
Conceptually it's a simple replacement process, cramming longer repeated items into smaller variables. I'm not sure how many times it runs through, but I assume more than once, packing variables into variables. This is different from minifying HTML & CSS, but similar to how Javascript minification works.

Here's an example of one of my own pages I just ran:

17gzip.png


That's a substantial savings, as @JasonSc pointed out. 81%... I'll take it! You can run this same test at: http://www.gidnetwork.com/tools/gzip-test.php

If you're using Wordpress, let WP Super Cache do this for you along with other caching work. It will set up pre-compression and expiry dates and more to save you a ton of server load. If you embark upon doing this yourself, make sure you really take the time to understand your options. A wrong configuration can save you in one area at a trade off in another...
 
Just tested out this new code that allows you to load CSS without blocking rendering:

Code:
<link rel="stylesheet" href="css.css" media="none" onload="if(media!='all')media='all'">
<noscript><link rel="stylesheet" href="css.css"></noscript>

Source: http://keithclark.co.uk/articles/loading-css-without-blocking-render/

Allowed me to get 100/100 with Desktop. Mobile is so full of it, it's like come on dudes...

wcICD6B.jpg
 
Question @Ryuzaki: if I'm using a theme and start with a demo import that includes dozens of pages, does reducing the number of dummy pages speed up my site? I currently have countless pages that were included in the import, and I was wondering whether getting rid of them would impact my site/page speed.
 
@Richard, initially I'd say no, in the grand scheme of things. Those are pages set up in the database. And even with dozens... let's say 36, that's not going to bloat your database too much, relatively speaking. If you want to cover all of your bases then I see no reason not to delete them. But 36 dummy pages in a database of 1000+ posts is a drop in the bucket.

In my case, being anal retentive, I'd definitely delete them and any images uploaded to their posts associated only with those posts, partially for speed and mainly for database and server organization and future storage considerations. The gain to speed would be negligible but I have to keep a clean house.

To get a real world idea of how much this would impact your site, check out WP-Optimize, mentioned above and assuming you're talking about Wordpress. Use it to clear out post auto-saves, post revisions, trashed comments & spam, and anything else you see floating about. Do 3 before measurements and 3 after measurements and average them out and you'll get an idea of the savings. Of course it depends on how much you delete out of the database. It can definitely add up, especially if you're not using server caching.
 
Expert Methods
The Crash Course is meant for beginners, but many of you reading this have gone on to become intermediate and even expert level developers and marketers. And if that's in part due to the Crash Course then it's likely you return to it to refresh your memory.

There's far more for page speed that can be done than was ever mentioned in the opening post. While I still need to hold some cards close to my chest, there's a lot I can tell you that can put the finishing touches on your speed.

As always, these are quick wins that anybody can use if they're willing to take a bit of time to use them. This isn't meant to make you an expert, only to get you expert results with minimal effort.

PHP Version
First and foremost, let me tell you something that'll possibly make you rage. Lots of hosting companies are still rolling out servers on PHP 5.6. You'd think if you have them spin up a brand new VPS for you that you'd get placed on the newest tech. Nope. Servers you're already renting aren't being updated either.

Why does this happen? Because they don't know what kind of crap you're going to install or already have installed on your server. PHP 5.6 is way old now, but it's safe. Even the latest adopters and stragglers have updated their plugins, themes, and CMS's to PHP 5.6 by now.

And though it's been literal years since PHP 7.0 and beyond were released, server companies know that a lot of you use the wackiest plugins, abandoned for 5 years, and that upgrading your PHP will break your site. So they simply don't touch it, and you never know any better.

18php.png

Take a gander at that. For the newest tier of Wordpress, PHP 7.3 is almost three times as fast as PHP 5.6. And I bet you're on PHP 5.6.

And I bet you didn't know you can change it yourself in about 30 seconds in WHM or cPanel. If you don't have either of those, the company's custom dashboard probably lets you change it. Otherwise you can just email them.

But before you do that, it's critical that you check all of your plugins and make sure they're compatible with this newer PHP version. Otherwise you're going to have a bad day.

This is a huge benefit to you. There are users that never hit your cache. There are things that can't be cached. There's cache generation. There's reading and writing to the database. There are a lot of PHP operations that this will improve drastically. You'll feel it most while you work within the Wordpress dashboard, but there's a class of users of yours that will enjoy the benefits too.

HTTP/2
This is easily the biggest thing you can do for your page speed: migrate from HTTP/1 to HTTP/2. Again, any legacy server you've been renting for a long time will not have made this jump for you. The reason is that it requires updating infrastructure beyond just flipping a switch, things that can break redirects, as an example.

You won't be able to do this yourself, but you can ask your host to do it for you. It should take no more than 30 minutes to enable HTTP/2 and then browse around your backend and frontend to make sure everything is fine.

Why does this matter? Because on HTTP/1, when the user's browser requests 25 files, your server will grab the first one and deliver it. Once that's done, it will go grab the second file and deliver it... and so on. That looks like this:

19one.png

This is the timing waterfall showing how files are downloaded from the server to your browser. Notice how one file connects, waits for the server, and downloads, and only then does the next one start. One. at. a. time.

On HTTP/2, the user's browser makes its 25 HTTP requests and your server grabs all 25 files and sends them over at the same time:

20two.png

I don't even need to explain how advantageous this is. It's the difference between ordering a meal in a restaurant and the server bringing one item at a time, running back to the kitchen for each one, versus bringing your entree and two sides and drink all at once on one platter. You wouldn't tolerate the one-at-a-time method in a restaurant but that's probably how you're serving your users on your website.

There are some implications I'll explain below that will "undo" some of your previous knowledge and make your site even faster still. Both of the sections below require you have your server configured for HTTP/2.

Server Pushing
This is extremely complicated and it took me a long time to sort it out. For that reason I'm not going to just dump code on you. There's a risk that it'll require some tweaks and since it has to go in your .htaccess file there's a huge chance you could take your entire site down.

But I'll explain what this is in case you want to go on the adventure yourself. On HTTP/2 we gain the capability of server pushing, which goes like this.

Imagine you're a regular in that restaurant I mentioned above. You always order a water, a diet coke, and a salad before your meal. Your meal is always different, but those first 3 items never change and always come first. You won't even think about eating your meal (rendering the page) before you have those 3 items.

Instead of the waiter doing the same song and dance every time, asking you what you want and you go through the script of ordering those 3 items, they save time by just bringing it out to you as soon as you're seated. That's what server pushing is.

In "HTTP request" terms, it's like this. The user's browser requests the HTML page and all the files needed within it. You as the website owner know that, no matter what page they land on, they will always need your CSS file, your jQuery file, and your jQuery Migrate file. Rather than waiting on them to parse the HTML and discover they need those, you go ahead and send them before they even ask for them or anything else.

You're choosing to send them the render-critical files, the ones that have to be used before anything else can possibly show up on the page. In some cases this may only save you 100 ms. In some cases it could be a third of a second. It looks like this:

21three.png

The top line shows the usual fiasco. The user is asking for the source code of the page. It does the SSL handshake, makes the connection, waits for the file to be retrieved, downloads it, then parses it.

Normally after that we have to do the exact same nonsense again to get the next three files, but notice that they only show a white section and then the blue. There's none of the extra "handshake" stuff because you're spitting these three files to the browser before it even asks for the next files (which if you notice on the 4th line has to go through all the connecting and waiting again).

If you can figure out which files to push, you can cut a significant chunk of time off your site. If you're a dummy with a million plugins and a theme that has a million CSS and JS files, this won't do you any good. If you try to push too many files, you can actually make your loading time worse. And if you use a CDN then you'll have to work with them to make this happen on their side too.

I'd call this a very advanced method that is only worth doing if you've covered every other base and have minimal render-blocking resources to send, and even then the results are "semi-negligible/semi-significant." If you decide not to mess with this, I wouldn't blame you and it won't make or break your speed.

Don't Concatenate
Finally, let me "undo" some of what I told you before. We're only going back on the previous teachings if and only if you're on HTTP/2. If you're not, then this is bad advice.

Beforehand, on HTTP/1, I told you to do things like combine all of your CSS files into one, combine your JS files into one, make CSS sprite sheets for images, etc. This is because when you get files one at a time, you have to connect, wait, and download over and over again. So it's better to do that dance one time and download a much larger file than to do it over and over again.

Because HTTP/2 lets you spit out so many files at once, you're only doing the "connect and download" dance once. So it's more beneficial to keep your files separated. It's better to send 3 smaller CSS files than 1 giant one, because the browser can download them at the same time, parse them, and render in basically a third of the time.

So for you guys that got a bulky theme and too many plugins that you're married to now, rather than using Autoptimize and other tricks to concatenate, you're far better off not doing that and getting your server set up for HTTP/2.

Lazy Loading
Let me move away from the HTTP/2 stuff now and mention Lazy Loading. I never talked about this before because it doesn't really impact page speed much if you're doing the rest of the things we've mentioned in the OP and in this post. You'll have already prioritized "above the fold" rendering.

I talked above about not using tricks and cutting corners, which is part of the reason I never mentioned Lazy Loading. What it does is DEFER the loading of images and iframes (be careful here, it can mess up your display ads). Deferring means the image flat out won't load until last, but Lazy Loading doesn't just wait till last, it waits until the image is about to appear in the user's browser viewport. It can measure this in several ways, one of which is a browser API called the Intersection Observer.

The benefit here is that you may have some images above the fold that you can get out of the way of loading so the rest of the page can hurry and load. You want your CSS layout and your text on the page ASAP. Another benefit is you'll save your server a ton of resource usage. Your RAM and CPU usage will go down but most especially your bandwidth usage. Because if a user only scrolls halfway down the page, they never need to load the images on the bottom half and won't.

I urge you to understand exactly how this works, with Javascript, Noscript, Intersection Observer, etc. I won't dig into all of that here. I'll just point you to the best plugin for getting this done (and it's free) which is WP Rocket's Lazy Load.
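The markup pattern most lazy loaders use looks roughly like this (the class name and file paths are placeholders). The plugin's JS swaps data-src into src when the image nears the viewport, and the noscript copy covers users without JavaScript:

```html
<!-- Real URL held in data-src; the lazy load script fills in src later -->
<img class="lazyload" data-src="/images/photo.jpg" alt="Example photo">
<noscript>
  <!-- Fallback so no-JS users (and crawlers) still get the image -->
  <img src="/images/photo.jpg" alt="Example photo">
</noscript>
```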

I also urge you to not lazy load iFrames if you run any kind of ads. I'd also tell you to learn about the CSS classes you can add to an image to make sure it does load instead of being lazy loaded. You can also figure out some CSS to make the images fade in, which is classy and doesn't jar the user.

Understanding What Matters
In the opening lesson, I made some decisions about what to include and what to exclude. The danger of telling people "you only need to worry about this one aspect" is that they'll stop learning right at that threshold. I'd have been doing you a disservice at the time, which is why some of this stuff was left out.

Another thing I left out was explaining exactly what matters. Let me break that down into two areas:

For the People
What I emphasized in the original post was optimizing your entire page. It's good for the people. It's good for their mobile phone bandwidth limits. It's good to keep their browsers from bogging down. And it keeps them from bouncing from your site and not giving you money.

I showed you an easy way to measure the entire data load of your site, how many HTTP requests are coming through, and how long it takes to render the whole page. That's still useful. Pingdom is one option and there are tons of others like GTMetrix and WebPageTest.org. They all also show you a waterfall, which you can access on your own using your browser's developer tools (disable the cache when you do this).

I don't want to harp any more on "for the people." We tried to stick to that angle when we wrote the Crash Course, but let's be realistic too. Tons of us are SEOs. What we didn't and don't want is for everyone to think like SEOs; we want newcomers to think like Marketers (of which SEO is only a part). But let me add the SEO stuff here, because it's important to all of us.

For the Robots
This is the last time I'll say this: Please optimize for the people, too. For SEO, your main concern is optimizing for the robots, which is to basically say the Google spiders. They're now rendering pages and taking page speed measurements as well as taking some from Chrome browser users. So, as a sneaky SEO, the question becomes how do you trick them out to get better scores? You don't, you do some real work.

Any method above, like inlining CSS in the <head>, making JS asynchronous where possible, server pushing, etc... all of that will help with this. What is "this?" It's getting the above-the-fold rendered as fast as you can.
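Making JS asynchronous where possible is often just an attribute on the script tag (the file names here are placeholders):

```html
<!-- defer: downloads in parallel, executes in order after HTML parsing -->
<script defer src="/js/main.js"></script>

<!-- async: downloads in parallel, executes as soon as it arrives -->
<script async src="/js/analytics.js"></script>
```

Use defer for scripts that depend on the DOM or on each other, and async for independent scripts like analytics.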

If you want to know how you're doing, your best bet is to run your page through Google PageSpeed Insights. It will give you both a Mobile score (yours is probably pretty bad, don't panic) and a Desktop score (which will be much better).

It'll also audit your page and tell you what you're doing right and what you're doing wrong. I'll show you an example from one of my websites:

22me.png

That's almost a perfect score on Desktop. On Mobile I get a 94. I'm not showing this to brag but to share with you what is possible if you apply everything in this day of the Crash Course. Of course, some of it requires building your own theme with speed in mind and coding your own solutions rather than using plugins. But you can learn all of that too, if you have the gusto.

Rather than me type all of this out, this screen shot from the same page has the explanations embedded:

23explain.png

Let me tell you this. I actually hit 100 on Desktop and 100 on Mobile, but my TTI (Time to Interactive) is a bit slow now because I have ads running. That goes to show that having TTI and First CPU Idle a bit slower doesn't impact your score that much. Google doesn't weight them as heavily as it does the first two items in that image.

Those are First Contentful Paint and First Meaningful Paint. The basic idea is this: it doesn't matter what it is, just get ANYTHING on the page. It might be text that's not even in the font you intend to show the user, but flash it on the screen anyway.

The reason is, your loading acts like a "progress bar" for the user. As long as they see something happening they'll be happy to wait. But they want to know something is happening. If your site shows a blank page for 2 seconds and then BLAM the entire page flashes onto the screen at once, you're going to get a bad score here. Your First Contentful Paint will be 2s and your First Meaningful Paint will be 2s.

Imagine that you pop the text on the page and then everything else waits till 2s to show up. Your First Contentful Paint could be as low as 500 ms, which will drastically improve your scores. And this is what Google is using in their ranking algorithms too.

So the game becomes "How do I get absolutely anything on the screen ASAP and then finish off the above the fold ASAP, the rest be damned?" I've given you a lot of tools to get it done.

I recommend you work on your Desktop scores first, which are measured with "high speed internet" and then work on your Mobile scores, which are measured with some real cruddy 3G speeds to match what people in less wealthy countries are dealing with.

Most of you will never get into the 90+ range on either Desktop or Mobile and that's fine. Just improve. Remember that probably 95% of websites don't even know speed optimization exists. Get your gains, beat your competitors, and remember that this is only one factor of at least 300 in the ranking algorithm, and that's before we talk about side filters. It's a big deal, but it's not a big deal, you know. Don't stress.

I'll come back one day in the future with more info you can apply. I just have to stretch it out as long as I can or I'll run out of material!
 
Anyone have thoughts on HTTP/3 / QUIC? If you're starting from HTTP/1 does it make sense to skip HTTP/2 and get on the newest protocol (whether for speed or other reasons)?
 
Anyone have thoughts on HTTP/3 / QUIC? If you're starting from HTTP/1 does it make sense to skip HTTP/2 and get on the newest protocol (whether for speed or other reasons)?

I wrote at length about HTTP/3 and the benefits of QUIC over TCP here: HTTP/3 is Coming!.

The summary of it is that on HTTP/3 the TLS handshake's 3 steps are processed faster. Beyond that, it doesn't offer any speed advantages over HTTP/2 and its multiplexing of HTTP requests. QUIC is the main benefit.

QUIC packets get "indexed" for lack of a better word so if a packet drops, the entire stream doesn't stop and wait to re-fetch and send that packet again. The entire stream of resources continues to send while the dropped QUIC packet is re-fetched and sent "asynchronously." It removes the bottleneck that can occur and keeps things moving.

I don't know how often packets get dropped in the grand scheme of things, percentage-wise, but in those cases this should shave off a significant amount of time (100 - 300 ms I'd guess).

Also, the header compression scheme HPACK is now QPACK. The old way required HTTP headers to come in one at a time in a specific order. Now they can come in any order and be multiplexed like the HTTP resources are.

And finally, for the server pushing stuff I mentioned above, this actually starts pushing all resources before the TLS handshake is even completed, which is going to save us all on the order of another 250 - 500 ms in many cases. So server pushing won't be needed, it's built in for all resources instead of specific ones.

It's a huge boost. There's no reason to jump to HTTP/2 if you can go to HTTP/3 first. I'm not sure that it's entirely ready yet. Browsers are still in adoption mode. Cloudflare is ready, I think, and they helped build it. Once it's ready, I see no reason to wait or to use HTTP/2 as a stepping stone. I'd expect HTTP/3 would fall back to HTTP/2 if a user's browser couldn't handle it. Don't quote me on that though.
 
Anyone have thoughts on HTTP/3 / QUIC? If you're starting from HTTP/1 does it make sense to skip HTTP/2 and get on the newest protocol (whether for speed or other reasons)?
I'd say that if your hosting package/provider/system requires you to spend too much time thinking about or implementing this, then you are doing it wrong. HTTP/2 is a piece of cake with something like WordOps.net.
 
Adding HTTP/2 is extremely simple on NGINX and Apache2 servers. It's just a matter of editing the configuration files after updating the software. If you're already forcing SSL, which I assume you guys are, you literally just add "http2" to the SSL listen line and reload the server.
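On NGINX, for example, that one-line change looks roughly like this (the domain is a placeholder, and this assumes your server block already terminates SSL on 443):

```nginx
server {
    # Adding "http2" to the existing SSL listen directive enables HTTP/2
    listen 443 ssl http2;
    server_name example.com;
    # ... the rest of your existing configuration stays as-is ...
}
```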

There is nothing else you really need to do - unless you use cPanel or some fancy admin GUI interface and they haven't added the button to enable it. Command line wins every time. Most of the time you can just Google these things and find guides on how, like these:

Nginx: https://www.howtoforge.com/how-to-enable-http-2-in-nginx/

Apache2: https://www.howtoforge.com/how-to-enable-http-2-in-apache/
 
I've posted in the past about how I thought my server setup was tight (WordOps - nginx, caching) with plugins like WP Rocket and lazy loading being as much as I could do. I have always gotten abysmal pagespeed scores for mobile and desktop, but thought it was due to the display ads I run (there are a lot - that's my model) and therefore there was nothing I could do about it. I have tested removing ads in the past to try and get higher ranks and the income/traffic trade-off wasn't worth it.

Anyway this latest Google update and all the chatter around pagespeed really got me thinking. So what I have done is:
  1. Introduced webp images. I use shortpixel to make them and WP Rocket to serve them.
  2. Deferred/inlined CSS and JS with a plugin called Hummingbird (these settings are under advanced options under Asset Optimisation).
I don't use any other options from Hummingbird as WP Rocket takes care of those.

#1 was straightforward. #2 not so much.

Hummingbird is good because it lists all the internal CSS and JS stuff you are loading on your pages (including scripts and CSS from your theme and plugins) and lets you inline, defer, combine etc.

Implementing it involved me ticking each box on and off and then checking several different types of pages on the site (that make use of that CSS/JS) to make sure no functionality was broken or not too messed up.

The end result is that I went from around 18 in Pagespeed to 50 (for desktop). Gained about the same in mobile too.

I will monitor my pagespeed in GA now and see if anything is reflected there, and of course my rankings.
 

Interesting.

I will experiment and report back.

I'm at 50 google mobile pagespeed for some pages and 90+ Yslow with Webp, size optimization and WP Rocket /Lazy Load.

This might push me into the green.
 
Watch out for ShortPixel; it won't work with some plugins and services - for sure on some WooCommerce setups. Some images will get converted and some won't (ones managed by plugins, for example).
 
Has no one tried out the Ezoic all in one site speed plugin? I referenced it here FYI.

Today I went back and turned on KeyCDN for one of my sites. I have been toying with the idea of a Kinsta trial but really can't see what they can offer over and above my current setup with WordOps and UpCloud. A CDN was the only major thing missing versus what they offer for 25x the price, so this test will be interesting. No immediate large differences are noticeable using online pagespeed tools, but I will monitor things.
 
Core Web Vitals

Google has announced their intentions to roll their new "core web vitals" into the algorithm. These are three "new" user experience metrics that contain a lot of page speed and page loading metrics within them.

In combination with Mobile Friendly (responsive design), Safe Browsing (no cloaking, cookie stuffing, etc.), HTTPS (SSL compliant) and No Intrusive Interstitials (full page takeover ads and pop ups), they are adding the following:

core1.png

Core Web Vitals:
  • Loading - Largest Contentful Paint (LCP)​
  • Interactivity - First Input Delay (FID)​
  • Visual Stability - Cumulative Layout Shift (CLS)​
Let's talk about these one at a time and consider how we can prepare for this new rollout.

Use the image below as a reference for what are acceptable speeds:

core2.png

YOUR FIRST STEP in correcting any issues with LCP, FID, and CLS is to apply the page speed principles higher up in this thread first. That will solve many of the problems without you needing to dig into specifics.

Loading - Largest Contentful Paint (LCP)

First and foremost, this is mainly a page speed metric, but it's not about how fast the entire page finishes loading. It's about "perceived speed," meaning it measures how fast the user thinks the page is loading.

Google's recommendation is that we reach the Largest Contentful Paint in 2.5 seconds. That should be no problem for most users anywhere near the server, but Google is also using a throttled 3G connection simulated at a further distance. So the truth is you need to be a lot faster than 2.5 seconds to satisfy everyone globally.

LCP refers to, basically, when the above-the-fold region is fully painted with all of the content needed to begin reading the page. That means the top portion of your site needs to render before the bottom so the user feels like it's all loaded and ready.

How do we pull this off?

Server and Browser Caching

First and foremost, you need to be using server and browser caching. We've covered this above and the best ways to achieve it with Wordpress. I still recommend WP Super Cache. If you install it and turn it on, you'll see massive improvements. You can later learn how to tweak the settings for better and/or more efficient results. For browser caching you can use the code shared above.

The idea here is that your server goes ahead and builds the HTML out of all the needed PHP and database queries and saves that HTML so it can serve that immediately without pinging the database over and over.
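For the browser-caching half, here's a minimal Apache `.htaccess` sketch, assuming mod_expires is enabled (the lifetimes are illustrative, not a prescription):

```
<IfModule mod_expires.c>
  ExpiresActive On
  # Far-future lifetimes for static assets that rarely change
  ExpiresByType image/webp               "access plus 1 year"
  ExpiresByType image/jpeg               "access plus 1 year"
  ExpiresByType image/png                "access plus 1 year"
  ExpiresByType text/css                 "access plus 1 month"
  ExpiresByType application/javascript   "access plus 1 month"
  # Keep HTML short-lived so returning visitors pick up content edits
  ExpiresByType text/html                "access plus 0 seconds"
</IfModule>
```

On nginx the same idea is an `expires` directive inside a `location` block matching your static file extensions.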

Prioritize CSS Loading

Your second mode of attack is to prioritize the loading of your CSS. This isn't always necessary if your CSS file isn't huge. There are two ways to do it. One is to inline the CSS for the above-the-fold elements. That way, when the browser downloads the HTML file, the CSS is already baked in and can be rendered. This is a pain in the butt to do unless your entire site uses the same templates on every single page.

Another thing you can do is take all of the CSS that's for elements that are above-the-fold on all pages and stuff them into a separate CSS file. You'll also want to insert the code for the body, the backgrounds, the text styling, and anything else that is above-the-fold but also below the fold, if that makes sense.

What you do then is link to the main CSS file in the footer, effectively deferring its load impact, so that the above-the-fold CSS file can load first and faster. Then other needed elements like images and fonts can go ahead and load too. Finally, the below-the-fold CSS loads in the background.
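A sketch of the critical-CSS approach described above (file names and selectors are placeholders; the preload-swap pattern shown for the main stylesheet is a common alternative to a footer link):

```
<head>
  <style>
    /* critical.css: only the rules needed to paint the above-the-fold area */
    body { margin: 0; font-family: Georgia, serif; background: #fff; }
    .site-header { height: 80px; background: #222; }
  </style>

  <!-- Load the full stylesheet without blocking the first paint -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

The `<noscript>` fallback keeps the page styled for visitors with JavaScript disabled.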

Defer or Use Asynchronous Javascript

A huge culprit these days is giant JavaScript libraries and files. These aren't going to be needed to render most websites unless you're doing progressive web apps and things of that nature. If you are, you can break your files apart just like with the CSS files.

Otherwise, you can defer them (execute only after the HTML is parsed) or let them load asynchronously (download in parallel) so as not to block the rendering of everything else above-the-fold.
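In markup terms, the difference looks like this (file names are placeholders):

```
<!-- Render-blocking: the parser stops until this downloads and executes -->
<script src="/js/app.js"></script>

<!-- defer: downloads in parallel, executes in order after the HTML is parsed -->
<script defer src="/js/app.js"></script>

<!-- async: downloads in parallel, executes as soon as it arrives (order not guaranteed) -->
<script async src="/js/analytics.js"></script>
```

As a rule of thumb, `defer` suits scripts that touch the DOM, while `async` suits independent scripts like analytics.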

Lazy Load Images Below the Fold

Finally, you want to lazy load images, which are the worst culprits for render-blocking the top portion of the screen. You can lazy load images below-the-fold without lazy loading them above-the-fold, though this can get tricky with a CMS like Wordpress, where you'd need to manually tag images inside the post content to not be lazy loaded if they're near the top. You'll also want to do this with display advertisements where possible, or forgo them above the fold altogether.
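One way to get that above/below-the-fold split is native lazy loading, which modern browsers support through the `loading` attribute (file names are placeholders):

```
<!-- Above the fold: load eagerly so the hero image isn't delayed -->
<img src="/img/hero.jpg" width="1200" height="600" alt="Hero image">

<!-- Below the fold: the browser defers the download until the user scrolls near it -->
<img src="/img/diagram.png" loading="lazy" width="800" height="450" alt="Diagram">
```

Plugin-based lazy loading works too, but the native attribute needs no JavaScript at all.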

Interactivity - First Input Delay (FID)

The First Input Delay (FID) measures how frustrating and confusing a web page might be to a user initially. Google recommends bringing this number below 100 milliseconds.

What this refers to is how soon the page elements can be interacted with once they've rendered. This is mainly a problem on JavaScript-heavy websites. If you render an HTML and CSS button, but using it requires JavaScript to be ready (for click tracking, destination links being added after the fact, etc.), then there will be a time gap between rendering and usability.

Another example might be clickable regions that make a menu drop down. You can think of this as optimizing the speed at which you load your javascript.

So the problem now is... what portions of your javascript do you defer? Do you need multiple JS files, one prioritized for above the fold? (Probably).

Visual Stability - Cumulative Layout Shift (CLS)

The Cumulative Layout Shift has to do with your page jumping around as elements load. Google wants this to be non-existent, pretty much.

If you're loading all text, you won't have this problem. But if you load all the text and then the CSS comes pouring in and styles it, it's all going to shift some. If you're loading an image but didn't use the proper HTML to tell the browser what size of image to expect, you'll have a shift. If you're lazy loading images by swapping in a 1x1-pixel transparent PNG placeholder, you're going to have a shift.

The only way around this is to not lazy load your above-the-fold images, make sure the browser knows what size to expect in the layout, and inline the above-the-fold CSS so it doesn't load in later. But even then, if you're not using a system font, you're still going to get a shift once the font you're serving is applied.
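Two of those fixes in markup/CSS form (names and paths are placeholders):

```
<!-- width/height let the browser reserve the box before the image downloads,
     so nothing shifts when it arrives -->
<img src="/img/photo.jpg" width="800" height="450" alt="Photo">

<style>
  /* font-display controls the text swap; "optional" avoids a late shift by
     keeping the fallback font if the webfont isn't ready almost immediately */
  @font-face {
    font-family: "BodyFont";  /* hypothetical family name */
    src: url("/fonts/bodyfont.woff2") format("woff2");
    font-display: optional;
  }
</style>
```

The trade-off with `font-display: optional` is that some visitors will see the fallback font for the whole visit; `swap` shows your font sooner at the cost of one shift.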

Implications

What a giant pain in the butt. For the most part, most of our sites are going to be in the acceptable regions for at least the LCP and FID. I think very few sites will achieve a low CLS. Fortunately, that means our goal isn't to be in the "good" regions or have perfect scores. The goal is to outcompete your competitors, who are undoubtedly all going to suck big time.

How To Measure the Core Web Vitals

Your best move is going to be to use Google's Pagespeed Insights:

core3.png

You'll look great on Desktop, like above. You'll score worse on Mobile, as will everyone else; I described why above.

Other ways to check are to look inside Search Console, which will list every page and whether it has issues. Web.dev uses Pagespeed Insights as well. Chrome DevTools will now show you these vitals, as will the Chrome UX Report (CrUX), which gathers the info through an API that Chrome reports to Google from users all over the planet.

Don't be alarmed if your numbers look horrible. People in poorer countries with bad internet connections with data traveling long distances are going to skew your numbers. I have to assume Google knows not to apply that specific data to your origin country's SERPs or they'll screw up the quality of the results.
 
If you guys ever need to download Google Fonts you can use this helper: Google Webfonts Helper

By downloading the fonts and hosting them locally, you help increase pagespeed, and it's one less external call and/or dependency.
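Once the font files are on your server, you wire them up with `@font-face` CSS along these lines (the family name and paths are placeholders):

```
/* Self-hosted webfont instead of a call to Google's servers */
@font-face {
  font-family: "Open Sans";             /* example family */
  font-style: normal;
  font-weight: 400;
  src: url("/fonts/open-sans-regular.woff2") format("woff2"),
       url("/fonts/open-sans-regular.woff") format("woff");
  font-display: swap;                   /* show fallback text while loading */
}
```

Serve the files from the same domain and the browser's existing connection handles them, with no extra DNS lookup or TLS handshake.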
 