How Does GZip Work in Relation to Site Speed Tools?

JasonSc

Question about Gzip.

I am having a conversation/debate with someone. They think the Pingdom size is the entire size of the site and that it's what matters most. I'm more of the mindset of using Coverage in Chrome DevTools, which shows unused CSS and JS and the total uncompressed size.

The site has a Pingdom total load of 1.1 MB (it's WordPress) and loads in 1.57s, which isn't lightning fast, but for a local business site it's OK.

Chrome is showing 2.8 MB of CSS & JS with a total page size of 5.7 MB. It scores a 12 on PageSpeed Insights Mobile, which is unacceptable.

I understand comparing Pingdom to PageSpeed Insights isn't really apples to apples, but...

I assume the browser must unzip the file and then process it prior to rendering, loosely speaking.

The larger the unzipped file, the more work the browser has to do, potentially slowing down the website, first paint, and time to interactive?
 
@JasonSc, I know this isn't quite what you're talking about, but I'd argue that site size (whether Gzipped or not) isn't what's important. It's about getting the First and Largest Contentful Paints down, reducing the First Input Delay, and reducing Cumulative Layout Shift, which makes a usable site unusable for a moment while the user reorients themselves.

It's about painting faster and getting to full interactivity, despite the size of the payload. It's not about payload size but about what occurs during the waterfall of downloading those requests.

I'd much rather have a site that loads most everything in, say, 800 milliseconds and is usable, but still requires another 2 seconds to finish downloading deferred or asynchronous requests (2.8 seconds total), than a site that finishes everything at the 1.5 second mark but where nothing flashes on the screen until then. The first has a 1.3 second longer total loading time but is usable 700 milliseconds faster.

Anyways, you're going to get much more useful mileage out of any tool Google provides than a 3rd party one when it comes to page speed. PageSpeed Insights, Web.dev, and the Chrome Lighthouse stuff are all going to spank everyone else in terms of what truly matters.

Yeah, the browser receives a Gzipped file and uncompresses it before using it, the idea being that downloading the smaller file and inflating it is still faster than downloading the raw, uncompressed file. And yes, the larger the underlying file, the more time it takes to uncompress and parse. You still want to trim it down as much as you can.
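
If you want to see the gap for yourself, here's a quick sketch in plain PHP (run from the CLI; style.css is a placeholder for whatever local CSS or JS file you point it at) comparing the raw size to the gzipped wire size:

```php
<?php
// Quick sketch: compare a file's raw size to its gzipped wire size.
// 'style.css' is a placeholder — point it at any local CSS/JS file.

$raw      = file_get_contents('style.css');
$zipped   = gzencode($raw, 9);  // roughly what the server sends over the wire
$inflated = gzdecode($zipped);  // what the browser reconstructs before parsing

printf("Raw:     %d bytes\n", strlen($raw));
printf("Gzipped: %d bytes\n", strlen($zipped));
printf("Ratio:   %.1f%%\n", 100 * strlen($zipped) / strlen($raw));

// The transfer shrinks, but parse work still scales with the raw size.
var_dump($inflated === $raw); // bool(true)
```

The transfer savings are usually big (text compresses well), but notice the inflated file is byte-for-byte the original: that full raw size is what the browser has to parse either way.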

2.8 MB of CSS and JS is absurd, in my opinion. Sounds like it contains the jQuery library, jQuery Migrate, JS from the theme, and then several files from the plugins (same story for the CSS).

The problem with that much CSS and JS is that the CSS is render blocking. It has to be downloaded, parsed, and applied before you can get a paint going (in most cases). You can defer it, but then you'll mess up your layout shifting. We've been put between a rock and a hard place in that regard. You can't defer any more; you have to get those files loaded earlier and faster, and trimming them down is the way to go.

The problem with JS is that it can cause layout shifting if you apply CSS from it, but it's also required for interactivity in a lot of cases (like using a mobile menu). Google is measuring that now too (First Input Delay). Think of an HTML/CSS button that might be visually ready but can't be used because the click functionality is programmed through JS.

You'll want to trim the CSS and JS as much as possible, and get rid of whatever you can entirely. These days, besides getting a faster Time to First Byte (TTFB) with a fast server and hosting provider, you can also use server pushing.
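
On the trimming side, in WordPress that usually looks something like this — a sketch for a theme's functions.php, where 'contact-form-styles' and 'contact-form-js' are hypothetical handles (pull your real ones from the page source):

```php
<?php
// Sketch for functions.php: drop scripts/styles you don't actually use.
// The contact-form handles below are made up — substitute your own.

// Remove jQuery Migrate from the front end.
add_action('wp_default_scripts', function ($scripts) {
    if (!is_admin() && isset($scripts->registered['jquery'])) {
        $scripts->registered['jquery']->deps = array_diff(
            $scripts->registered['jquery']->deps,
            ['jquery-migrate']
        );
    }
});

// Dequeue a plugin's assets everywhere except the one page that needs them.
add_action('wp_enqueue_scripts', function () {
    if (!is_page('contact')) {
        wp_dequeue_style('contact-form-styles'); // hypothetical handle
        wp_dequeue_script('contact-form-js');    // hypothetical handle
    }
}, 100);
```

Done plugin by plugin, that alone can claw back a huge chunk of that 2.8 MB.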

When the browser connects to your site, it has to do the SSL handshake and then receive your (hopefully cached) index file. It then un-Gzips that, parses it, and says "hey, I need these CSS and JS files."

But we know that we'll always need to send those CSS and JS files (except to return visitors, but the system handles that)... so why wait for the browser to unzip and parse the source code? Why not send those files alongside the index file? That's what server pushing does. You can also tell the browser to pre-connect to 3rd party resources you know you'll need, like a Google Font. I talk a lot about this in the Page Speed Optimization day of the Crash Course for anyone interested (there are more main posts as you scroll down).
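
To make the mechanics concrete, here's a minimal sketch of the preload/preconnect side in PHP — the paths and the font host are placeholders, and the headers have to go out before any HTML:

```php
<?php
// Sketch: emit Link headers before any output so the browser starts
// fetching critical assets without waiting to download and parse the HTML.
// Paths and the font host are placeholders — use your real assets.

header('Link: </css/critical.css>; rel=preload; as=style', false);
header('Link: </js/app.js>; rel=preload; as=script', false);

// Warm up the connection to a third-party origin you know you'll need.
header('Link: <https://fonts.gstatic.com>; rel=preconnect; crossorigin', false);
```

On servers like nginx with `http2_push_preload on;`, those preload headers get turned into actual HTTP/2 pushes; even without that, the browser starts the downloads as soon as the headers arrive, which is most of the win.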

Between getting this right (CSS & JS delivery), simple server-side caching (pre-compiling of the MySQL & PHP results, for instance), and then lazy loading images with appropriately sized placeholder HTML regions, you can boost your page speed dramatically without affecting your cumulative layout shift.
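
The lazy loading piece is mostly about giving every image explicit dimensions so the browser reserves the box up front. In a WordPress template it's something like this (the path and dimensions are just examples):

```php
<?php // Template sketch: explicit width/height reserve the image's box,
      // so the lazy-loaded file can't shift the layout when it arrives. ?>
<img src="<?php echo esc_url('/images/hero.jpg'); // placeholder path ?>"
     width="1200" height="630"
     loading="lazy"
     alt="Storefront photo">
```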
 
@Ryuzaki I can always count on you for a great detailed answer. Thank you. I 100% agree with everything you said.

Our conversation didn't even make it to the PageSpeed Insights results. We politely agreed to disagree.

The site in question is Elementor on top of a bloated theme, with 32 plugins. The unzipped CSS alone is 1.2 MB, and there's almost another 1 MB of JS.

I have come across some bloated sites, but this really takes the cake.
 
@JasonSc, yeah. I've run into some projects like that, where my job was to do speed optimization. The results are constrained to what's possible with that specific website, and sometimes even the absolute best results will still be mediocre.

In some cases it's better to start all the way over, from the ground up: finding or coding a super lightweight theme, refusing to let that plugin creep happen, and paying a VA to convert your content away from Elementor and into Gutenberg without all the fancy extra blocks. Trying to move backwards from off the cliff will take as much time and money as starting fresh, but with much worse results.

As a note, when doing this, keep the URLs all the same so you don't have to do the 301 dance. You'll confuse Google extra if you feed them an entirely new HTML structure as well as 301-ing.

When the redesign isn't an option, the very first thing I do is get a list of the plugins and talk with the site owner, one plugin at a time: "Do you need this? Do you even know why it's installed? Do you know where it's used? Can you live without it?" Chopping those out with a vengeance brings massive sitewide speed benefits, though that can leave you in the realm of "well, it's not as bad as it was, but..."
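
If it helps anyone doing that audit, here's a throwaway sketch to dump the active plugin list into a checklist you can walk through with the owner (assumes the script sits in the WordPress root; delete it when you're done):

```php
<?php
// Throwaway sketch: print the active plugins so you have a checklist
// to walk through with the site owner. Run once from the CLI, then delete.

require_once __DIR__ . '/wp-load.php'; // adjust if not in the WP root

foreach ((array) get_option('active_plugins') as $plugin) {
    echo $plugin, PHP_EOL; // e.g. "elementor/elementor.php"
}
```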
 