Core Web Vitals: Google Confirms New Ranking Signals

To make a long story short, Google made a change in Search Console, and now in PageSpeed Insights & Lighthouse, concerning some metrics about page speed. They're talking about them in terms of user or page experience metrics, but they all relate to loading speed.

What Part of Search Do the Core Web Vitals Affect?
Currently they will only affect the Top Stories feature on mobile (and they're removing the AMP requirement, which was obvious anti-competitive behavior, too).

When Will the Core Web Vitals Be Integrated Into the Main Algorithm?
Google says it will be at least 6 months before the changes are rolled out. Currently they're simply offering a heads up so we have time to prepare.

What Are the Google Core Web Vitals?
Currently Google has four main signals that measure "page experience". Those are:
  • Mobile Friendly
  • Safe Browsing
  • HTTPS / SSL
  • No Intrusive Interstitials
To that bundle they'll be adding the Core Web Vitals:
  • Loading - (Largest Contentful Paint) [LCP]
  • Interactivity - (First Input Delay) [FID]
  • Visual Stability - (Cumulative Layout Shift) [CLS]
That all looks like this:

[Image: Google's "Search signals for page experience" diagram combining the four existing signals with the three Core Web Vitals]

Google says:
“Core Web Vitals are a set of real-world, user-centered metrics that quantify key aspects of the user experience. They measure dimensions of web usability such as load time, interactivity, and the stability of content as it loads (so you don’t accidentally tap that button when it shifts under your finger – how annoying!)”

They also say:
“A good page experience doesn’t override having great, relevant content. However, in cases where there are multiple pages that have similar content, page experience becomes much more important for visibility in Search.”

So basically it's not going to be the strongest factor, but it can act as a tiebreaker.

Look for them in PageSpeed Insights like this:

[Image: PageSpeed Insights lab metrics with blue flags marking the Core Web Vitals]

Core Web Vitals will have the blue flag next to them, as seen above.
 
Google seems to be directly responsible for creating an epidemic of blurred, low-quality, over-compressed images.

Vendors and shops are shipping images that look absolutely like shit, only to shave a millisecond off their load time. I can't believe how poor the image quality on shops has become, all in the name of speed.
 
[Image: cartoon of a figure jumping through hoops]
 
If it's not crystal clear to an SEO at this late date that Google is serious about pagespeed and an uncluttered mobile experience... that SEO is dead in the water.

My biggest problem with the Web App is that all these damn sites have 4-5 pop-ups on mobile devices. Trying to read content is like playing Minesweeper nowadays. I don't want to sign up for your crappy newsletter upon the very first view of a single freaking pixel on your site.

Why are you asking for cookie acceptance from users NOT within Europe? We've still got freedom across the pond over here.

These sites aren't a sensible experience at this point, and it's only getting worse. Thank the gods Google is doing something about it.
 
My initial reaction after reading was “Great! Some new tech I can dive into and start implementing...”

But after giving this some thought I am slowly starting to arrive at the same conclusion that Neil Postman arrived at in 1992. In this case it is technology that is starting to dominate the online culture.

Granted, any person who wanted to be successful online had to start learning the basics like HTML, CSS, and so on.

After a short period, people figured out that repeating the same thing over and over, and having to adjust the things that sat in the middle (the actual content), was too time-consuming, and we arrived at WordPress, Joomla, and many other Content Management Systems.

This not only helped the average person to get an online blog going in a couple of minutes, but it also abstracted the technology and technical know-how that sits behind it away from the user.

Easy access to this sort of technology is what helped accelerate the growth of the internet. One only has to look at the market share within the CMS space to see that WordPress dominates. And in turn this benefited behemoths like Alphabet, colloquially known as “Google.”

And indeed, I’m skipping over a million other factors that led to the rise of the internet and companies like Alphabet. Things aren’t always that simple. It’s the old chicken-or-egg question.

But the point I’m trying to make is that the internet is meant to be open and accessible to everyone. Yet somehow I’m starting to believe that if we’re not careful, we’ll end up with a two-tiered system.

Technology is slowly transforming the internet into a pay-to-play scheme. Great for the Jeff Bezoses of the world. Mom-and-Pop can always join their optimized technology stack and get effed over when they don’t agree to yet another percentage increase.
 
Technology is slowly transforming the internet into a pay-to-play scheme. Great for the Jeff Bezoses of the world. Mom-and-Pop can always join their optimized technology stack and get effed over when they don’t agree to yet another percentage increase.
I'd agree.

I was actually going to discuss the idea of srcset in reply to @bernard's post about heavy images. But then I got caught up in finding out what the current implementation was, whether WP used it and if so how, and the concepts of screen resolution and display pixels, etc.

And, you know what, if it is too complicated to bother with for someone that has been creating sites for many years, then it is definitely way over the head of any casual site owner.
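For anyone who hasn't dug into it, here's a minimal sketch of the srcset idea (file names are hypothetical): the browser picks the smallest candidate image that still renders sharply for the current viewport width and device pixel ratio.

```html
<!-- Minimal srcset sketch (hypothetical file names). The browser chooses
     among the 400/800/1600px-wide candidates based on the sizes hint and
     the device pixel ratio; width/height reserve the layout box. -->
<img
  src="photo-800.jpg"
  srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 600px"
  alt="Example product photo"
  width="800" height="600">
```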

I'm not even sure I properly understand the concept of First Input Delay in relation to a web page and a real visitor (and am definitely sure that I find it hard to be bothered)...
 
I'm not even sure I properly understand the concept of First Input Delay in relation to a web page and a real visitor (and am definitely sure that I find it hard to be bothered)...

Well, usually I’m very careful about making claims and pretending to know the answer(s), but I’m more than willing to share my viewpoints.

To me this whole “optimise everything” mindset that comes out of the Alphabet pipeline is a double-edged sword. And in my opinion a sword they may even have fallen on unintentionally and unwillingly.

On one side of the equation we have a company that dominates search and under the banner of “do no evil” is cutting down on what @CCarter so beautifully described as “playing minesweeper.” And that is not a bad thing, if I’m being honest.

On the other side we have a company that dominates search, pretending it is not the only viable option out there in an effort to avoid a continued barrage of lawsuits and other sorts of legal trouble.

At the same time, it finds itself responsible for lowering the barrier to entry by playing along for years in the never-ending SEO game. A simple example of this is the open sharing of the Quality Rater Guidelines, which are meant for the raters who are subcontracted to Google via Lionbridge and so on.

When you combine this with all the blood, sweat, and tears that were, and still are, shed by SEOs, you quickly end up in a position where the final result of your service has (almost) doubled in value. And when the price for the user stays the same, the demand only increases.

To me, adding a layer of technical complexity is only a covert way to raise the barrier to entry, and thus the price. We are forced to add yet another skillset to our ever-expanding toolkit and will soon hit a point where Joey Blogs is unable to continue with “business as usual” without having to dig deep into his pockets.
 
I'm glad they're providing us with more tools, but I'm concerned about how accurate they are. When I look at my sites, the numbers these tests give me greatly differ from my real-world experience. When I go to my sites on a clean browser / new device (desktop or mobile), a page loads in a second or less. However, when I use Google's tools, it says my site is shit.

For example, this site loads in 1s or under as described above. Yet running it through PageSpeed Insights, I get a Mobile score of 61 and a Desktop score of 88:

Mobile 61:
[Image: PageSpeed Insights mobile score of 61]

Desktop 88:
[Image: PageSpeed Insights desktop score of 88]
 
When I go to my sites on a clean browser / new device (desktop or mobile), a page loads in a second or less. However, when I use Google's tools, it says my site is shit.

Should be obvious, but are you using WiFi when testing on mobile? If you are in a major city, mobile speeds are faster than in suburbs or rural areas. All of this is taken into consideration when computing that final mobile score.

As well, the device you are using might be the latest model, but Google still has to account for older models, slower non-5G connections, latency on older infrastructure - even your location during bad weather or other interference.

If you get a 1Gbps download speed at your office but 10Mbps at home, you'll clearly see a difference even on desktop.
 
If it's not crystal clear to an SEO at this late date that Google is serious about pagespeed and an uncluttered mobile experience... that SEO is dead in the water.

To quote your other thread before this was even live:

Another thing I want to stress: Google isn't playing around with mobile. If you've got a mobile score of <70 - and I've seen stuff like 4 or 18 from sites making a decent attempt - it ain't going to work.
 
@thisishatred I've been screaming this from the top of the mountain for a LONG time. But there are people that still believe that they can get away with 180MB+ page downloads on mobile - WHAT? I don't even think you can get away with that on desktop, let alone mobile.

Couple that with the overzealous need to monetize every pixel or get random people into your newsletter - it's out of control. And then what happens? As an alternative, people start using Instagram, TikTok, and other mobile apps instead of Safari and the Web App - cause at least those platforms aren't cluttered with junk ads, pop-ups, pop-unders, and an overall bad user experience.

To clarify, getting people into a newsletter or capturing their information so you can put them into your audience is necessary, actually critical. BUT there is no way that a person who just landed on your site 5 seconds ago is going to be a good candidate for your newsletter; they might be a window shopper or not even your target audience.

If people don't rethink the online user experience of their brand, powerful forces like Google will step in and force them to comply. Google knows in the long run it's in its best interest to keep the Web App alive, and forcing de-cluttering is a huge step in the right direction.

Now, if you've been growing your audience over the years through multiple traffic channels, then if Google or Pinterest or anyone does something that reduces your traffic, it won't impact you as much, since you can literally send an email or a direct message to your audience and get 5,000+ people to open it within an hour. That power alone keeps you from having to listen to Google's demands. But if you just want to play in Google's sandbox, then Google's self-interest in making a better Web App experience means you'll need to get in line or suffer.
 
But that's the thing though. I have a desktop PageSpeed score of 100 and mobile score of 99 and still show amber on two out of the four 'Core Web Vitals'.

Plus, given the recurrent and intermittent warnings about mobile on every site just because the Googlebot sometimes can't be arsed to load css and js files, you'll excuse me if I am somewhat sceptical about Google's technical AI abilities.

And let's not even get into the 'you must have a valid cookie and privacy and imprint declaration before you can use Adsense' hogwash...
 
Have they added this signal to the algorithm? I'm seeing massive changes across most of my sites. HUGE changes, with keywords moving as much as +28 positions overnight.

[Image: rank tracker screenshot of overnight keyword movements]
 
Have they added this signal to the algorithm? I'm seeing massive changes across most of my sites. HUGE changes, with keywords moving as much as +28 positions overnight.


Not sure if this is the signal, but there were changes to the SERPs for sure starting yesterday and still hitting today.

If you were with SERPWoo, you would have seen it in real time, and maybe gotten answers as to why and started on a fix.

Red is the trend line. Blue arrows are yesterday and today. This is volatility in our database.

[Image: SERPWoo volatility chart - red trend line, blue arrows marking yesterday and today]
 
I gotta say, I hate the srcset method mentioned above by @bernard and @ToffeeLa. I actively strip it out and find a middle-ground image size for mobile and desktop. The real solution is lazy loading, especially with 5G coming and increased bandwidths and caps.
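For anyone going the lazy-loading route, the native version is now a one-attribute affair; a minimal sketch (hypothetical file name) where the explicit width and height let the browser reserve the box before the image arrives, so nothing jumps when it loads:

```html
<!-- Native lazy loading: the image only downloads once the user scrolls
     near it. width/height hold the space so the layout doesn't shift. -->
<img src="gallery-photo.jpg" loading="lazy" alt="Gallery photo"
     width="800" height="450">
```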

@DanSkippy, I think of progress as a pendulum. It swings back and forth between two extremes over and over but it ratchets itself upwards one notch with each swing. It might feel like a rocking horse that never moves forward, but it's slowly moving upwards.

Things like Google enforcing page speed might increase the barrier to entry and increase prices (for developers [yay] and AdWords ads [boo]), but then someone comes along and creates a fast WordPress theme and we're right back to simple CMS stuff. That's the pendulum swinging. The real challenge is to get users not to sabotage themselves, which is why I make a point over and over to say that BuSo Lightning is for developers who know what they're doing and want a quick jumping-off point. It's not for the user that's going to install all the plugins they keep screwing themselves over with.

A lot of the stuff Google is doing by themselves and in collaboration with other companies is great. Like the work with the Internet Advertising Bureau about non-intrusive ads and then getting ad blockers to analyze that and not block them if they're within the guidelines. Same with page speed. It's a big net positive. But then they do crap like AMP and Facebook Instant Articles which is promoted as a net positive but is a greedy money grab and trying to build moats so you can't escape their properties.

I see Google as doing good and evil at the same time, moving the internet forward while trying to control it. I see stuff like CMSes and WYSIWYG editors and Gutenberg as innocent but moving the internet backwards again. I'm sure some of you remember GeoCities and Angelfire and the rest. There's a thread going on right now about Elementor's crazy DOM tree nonsense. That's moving backwards, even though it makes some design customization more accessible.

If someone isn't pushing for optimization, we'll just keep having more ads and visual designers that eat up 99% CPU and GPU power and time and then the only ones making money are AMD and Intel and them.

When I go to my sites on a clean browser / new device (desktop or mobile), a page loads in a second or less. However, when I use Google's tools, it says my site is shit.

That's because they throttle the connection to emulate a really crappy third-world 3G connection. It's the "leave no man behind" of the internet.

But that's the thing though. I have a desktop PageSpeed score of 100 and mobile score of 99 and still show amber on two out of the four 'Core Web Vitals'.

You're probably fine then. 99% of other sites are getting a 15 to 30 score on mobile and 70 on desktop. You don't have to be perfect, you just have to be better than everyone else.

Have they added this signal to the algorithm?

No. The information is in the opening post.

Not sure if this is the signal, but there were changes to the SERPs for sure starting yesterday and still hitting today.

Yeah, I'm seeing this too. Slowly eroding some traffic. Still above thanks to the gains from previous updates, but they're definitely turning some knobs. I actually have it on my to-do list to check and see what kind of posts are losing ground and try to trigger some freshness where needed.
 
Thanks for the update. The CTO and his team have some work to do.

When you combine this with all the blood, sweat, and tears that were, and still are, shed by SEOs, you quickly end up in a position where the final result of your service has (almost) doubled in value. And when the price for the user stays the same, the demand only increases.

To me, adding a layer of technical complexity is only a covert way to raise the barrier to entry, and thus the price. We are forced to add yet another skillset to our ever-expanding toolkit and will soon hit a point where Joey Blogs is unable to continue with “business as usual” without having to dig deep into his pockets.

No one ever said capitalism was fair. As Peter Thiel argued in Zero to One, competition isn't about giving the best product or service to consumers. That's what the populace is told. What competition is really about is striving to become a monopoly. So, yes, this would increase the barrier to entry for search engine traffic; however, a barrier to entry can either be a pain point or a source of security. If you can't keep up, you can't keep up.
 
Yeah, I'm seeing this too. Slowly eroding some traffic. Still above thanks to the gains from previous updates, but they're definitely turning some knobs. I actually have it on my to-do list to check and see what kind of posts are losing ground and try to trigger some freshness where needed.

I mostly recovered from my losses and, interestingly, I improved across the board, on all sites, including those that won (and lost) in the previous update.

Which is interesting, as I think this is another "user experience" update, while the previous one was a link/authority update. I usually lose in link/authority updates, but win it back with comparative user experience stuff.

In any case, I've decided to migrate all my sites to GeneratePress anyway, still using Elementor. I will pay someone to do it - any suggestions on who could handle such a migration? That is, keeping the look, feel, and overall functionality while moving from one WordPress theme to another?
 
I'm not even sure I properly understand the concept of First Input Delay in relation to a web page and a real visitor (and am definitely sure that I find it hard to be bothered)...

It has to do with JavaScript, basically. The idea is that, currently, you can load the HTML and CSS real fast and make it appear as if the site is usable, but none of the JS has loaded yet, so if someone tries to interact with the page and can't, that's a bad user experience.

So basically, say you load up and render a button, and the user can click on it but nothing is able to happen for another 800 milliseconds - you have an 800 ms FID.

Fast loading can actually be a bad user experience if it also implies interactivity that isn't yet available.
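To make that concrete, here's a minimal sketch (not from the post) of logging your own FID in the browser console, using the standard first-input performance entry:

```html
<script>
  // Observe the single first-input entry the browser records for the page.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // FID = gap between the user's first interaction and the moment
      // the main thread was free to start running the event handler.
      const fid = entry.processingStart - entry.startTime;
      console.log('FID:', Math.round(fid), 'ms');
    }
  }).observe({ type: 'first-input', buffered: true });
</script>
```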
 
What the heck are you meant to do with a display ads site?
 
It has to do with JavaScript, basically. The idea is that, currently, you can load the HTML and CSS real fast and make it appear as if the site is usable, but none of the JS has loaded yet, so if someone tries to interact with the page and can't, that's a bad user experience.

So basically, say you load up and render a button, and the user can click on it but nothing is able to happen for another 800 milliseconds - you have an 800 ms FID.

Fast loading can actually be a bad user experience if it also implies interactivity that isn't yet available.
It is somewhat ironic that the major changes PageSpeed Insights would have me make actually involve Google services (fonts and maps, in my case). Although it's hardly surprising, as Google Tag Manager and Analytics randomly stalling are a main cause of slow page loads on other people's sites (especially news sites).

Anyway, I've got the 12 pages highlighted all green in the lab, just as an experiment, so we will see if it makes any difference at all.
 
Some more information coming out about the May 2021 PageSpeed update:

[Image: Core Web Vitals thresholds graphic]

- First Input Delay (FID): Measures how quickly the page responds the first time a user interacts with it. This should occur within 100 milliseconds.
- Largest Contentful Paint (LCP): Measures the speed at which a page’s main content is loaded. This should occur within 2.5 seconds of landing on a page.
- Cumulative Layout Shift (CLS): Measures how often users experience unexpected layout shifts. Pages should maintain a CLS of less than 0.1.

Sauce: Google: All Core Web Vitals Must Be Met For Ranking Boost
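If you want to check those three numbers on your own pages in the field, one hedged sketch uses Google's open-source web-vitals package (this is the v1-era getCLS/getFID/getLCP API; later releases renamed these to onCLS/onFID/onLCP):

```html
<script type="module">
  // Load the library straight from a CDN as an ES module.
  import { getCLS, getFID, getLCP } from 'https://unpkg.com/web-vitals@1?module';

  // Each metric reports once its value is final (e.g. CLS on page hide).
  function report(metric) {
    console.log(metric.name, metric.value); // 'CLS' | 'FID' | 'LCP'
  }

  getCLS(report);
  getFID(report);
  getLCP(report);
</script>
```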


Google also offers 6 ways to measure this crap (FUCKING SEJ has a TON of popups - I close one, another one pops up, then another, then another, WTF): Google Now Has 6 Ways to Measure Core Web Vitals

--

At this point I've given up on Google and their PageSpeed Insights measurements; they've moved the goalposts to the fucking moon and we barely have tricycles. I'll update the SERPWoo blog post about the new page experience search signal later, though.

My main problem is I use the SW contact page as a control, and it scored 78 mobile and 100 desktop for years. This page has literally not changed in 2 years. I put it through PageSpeed Insights now, and the new score is 24 mobile and 89 desktop.

How does the page drop from 78 to 24? Moving the goalposts.

Google can go fuck themselves.


I'm back to using Pingdom and calling it a day. Make sure your page is small in size, loads fast, and makes few requests, and you're good to go. All this goalpost-moving shit just pisses people off, cause all my old spreadsheet metrics are dead now, since they were based on the old algorithm, I guess.
 
Google can go fuck themselves.
Yeah man, I've been saying it around the forum, probably even in this thread. They really overshot this time. Instead of being guided by REALITY they're trying to guide reality based on their EXPECTATIONS. And their expectations are all out of whack.

Meanwhile, we're also supposed to be meeting these metrics for people in the "third world" on 3G throttled mobile connections.

It's just dumb and I'm of the opinion not a single serious person with a user-centric website is going to achieve all 3 of these metrics. Most that try will come to the same conclusion of "go fuck yourself, Google." And the vast majority are already at that point. I'm certainly not going to try.

I had sites measuring 90+ that are now 25, etc. A big one is the Cumulative Layout Shift. AdSense can't meet those expectations. I only know of one single ad network that can, but they also now have to load a blank image of the same size the ad would be. And if the ad doesn't get filled, guess what? You have a blank image for users to scroll past.

This also means a lot of the old methods are now useless. Like Lazy Loading needs an entire re-think to make the CLS work out. I've achieved it on my sites, but how many sites aren't going to? (Most of them).
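Roughly, the placeholder trick described above looks like this (assumed markup, not any specific ad network's tags): reserve the slot's dimensions up front so a late-filling ad can't shift anything.

```html
<style>
  .ad-slot {
    width: 300px;
    min-height: 250px; /* hold the full slot before (and if) the ad fills */
    background: #f0f0f0;
  }
</style>
<div class="ad-slot">
  <!-- the ad script injects its creative here; if it never fills,
       the grey box still occupies the space, so nothing shifts -->
</div>
```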

Like, we had to choose between FOIT and FOUT (flash of invisible/unstyled text) if we didn't use system fonts, which can't provide a consistent design. Guess what both of those do now? They cause CLS to fail. So now we have to accept a slower First Contentful Paint if we want to provide a consistent design experience without harming CLS.

Think about that. When it comes to fonts you have to choose which one you want to fail: FCP or CLS. Google, go fuck yourself. That's an unwinnable gambit.
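For anyone who hasn't fought this particular fight, the FOIT/FOUT choice comes down to one CSS descriptor; a minimal sketch, where the font name and file path are hypothetical:

```html
<style>
  @font-face {
    font-family: 'MyWebFont'; /* hypothetical web font */
    src: url('/fonts/mywebfont.woff2') format('woff2');
    /* 'block' hides text while the font loads (FOIT, delays paint);
       'swap' shows fallback text immediately (FOUT, risks CLS when the
       real font swaps in and reflows the text). Pick your poison. */
    font-display: swap;
  }
  body { font-family: 'MyWebFont', sans-serif; }
</style>
```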

If you DO want to pass all 3, that essentially means having a wall of text at this point and nothing else. Have fun de-ranking (or not giving the boost to) everyone. Zero impact on the SERPs. Good use of your time, Google.

 
My guess is they will postpone the launch of the signal/boost.
 