Core Web Vitals: Google Confirms New Ranking Signals

Has anyone ever had any CLS or LCP issues? The only thing I've changed on my site in the past month has been the addition of a sidebar (which contains 2 images with a resolution of 300x250). I think these issues are what have caused my organic traffic to plummet since the beginning of the month.

[attached screenshot]
 
@DanielS, the Core Web Vitals are not yet a part of the algorithm. And when they do become a part of the algo, you won't be demoted; you only stand to get a boost from meeting the requirements of all 3 Vitals: CLS, FID, and LCP. At this point there are hardly any real websites meeting them (especially since you have to meet them for people on 3G mobile speeds). There's plenty of discussion on the forum about the Core Web Vitals now if you search, with clues on how to score better with them if you care. I wouldn't worry about these metrics as much as typical page speed, which will improve them as a consequence.
 
THE FIRST CONCESSION HAS BEEN MADE

[screenshot of the updated thresholds]

Tis but a small victory, but a victory nonetheless.

Now, instead of needing to be lower than the numbers above, you can be lower than or equal to them.

That's really just a difference of a millisecond on one metric and a tenth of a second on another, but it makes a difference.

This image is going around showing the impact it can make:

[attached screenshot]

Suddenly a lot of "Poor" pages are now "Needs Improvement". I hope, and I assume, that over time they'll continue to realize how absurd these requirements remain, even if it takes a long time.
 
I just checked, I'm on 4.5 seconds, I'm never getting under 2.5, not a chance.
 
I've been spending the last week trying to optimize my websites - mostly my main one.

This is the result for mobile test. It's an affiliate site that also has ads from AdThrive on it. I'm doing all I can to reduce the bloat without affecting the user experience.
[attached screenshot]


I've been spending the last 3 days trying to do anything I can to reduce LCP. It's typically 3.5-4.0 seconds.

Throughout this time I've been wondering whether I'm an idiot for not being able to solve the issue, or I'm an idiot for spending so much time on it.

I'm a little relieved that veteran users here also think that what Google is asking is unrealistic.
 
WP Engine WordPress sites can be gotten into the 90s relatively easily without major coding.

Static sites should already be like 95+ unless you're using huge images or weird gradient shit.

We'll see if things move up over the next few weeks.
 
WP Engine setup can be replicated quite easily for 1% of the cost. Most people will be struggling with Ad networks, themes or plugins. There is no excuse not to be 90%+ otherwise.
 
My PSI score jumped from 88 to 94 on mobile. From what I gather, PSI wasn't using HTTP/2 until March 3rd and now it is using it.

I tested just now, after reading this https://www.searchenginejournal.com/why-pagespeed-insights-scores-improved/397763/

If they weren't using it until a few days ago, as I see it, they were returning me bogus stats and causing me to waste time.

I am not extraordinarily experienced in SEO, so I used to believe everything Google said when it came to SEO, but I think my eyes are opening a bit.
 
Yeah, this is a giant "cluster hmm hmm", @nicemkay.

In other news, in about an hour's time I did manage to get my Cumulative Layout Shift to 0 (ZERO) on my sites. And that's with lazy loading, auto-loading display ads, everything.

Basically, everything needs to have a pre-defined height, whether that's an exact height, a minimum height, whatever, and whether that's inline (like image tags) or in the CSS (everything else). Lazy loaded images may be loading a 1x1 pixel image in place of the big image, but that little image needs to carve out a space the size of the big image.

This isn't just above the fold, it's all the way down to the footer. It wasn't necessarily hard, but some things had slipped through the cracks since I design my own themes and never needed to worry about this.

If you use a pre-made theme, hopefully the developers will do the same thing I did and push an update for you.
 
Where did you get the 0? On the "Lab" test or on the field data?

[attached screenshot]


The Lab test is easy. The other one is pretty hard with Ads.
 
Does anyone have a solution for fixing CLS on images that have an auto height? For example an image that is 100% width where the height adjusts according to the width of the browser window.
 
Does anyone have a solution for fixing CLS on images that have an auto height? For example an image that is 100% width where the height adjusts according to the width of the browser window.
You'll need to include the width and height attributes in the image tag itself. The browser can calculate the renderable height from the aspect ratio it creates. So like...

<img src="" alt="" height="300" width="400" />

Even if your CSS stretches or shrinks the image responsively, the browser will understand what the resulting height will be based on whatever the width ends up being.
 
Thanks :smile:
I've defeated the lab tests on my product pages, now to wait 28 days to see if the field tests come back ok. Man, website design never used to be so pedantic.
 
THE SECOND CONCESSION HAS BEEN MADE
(I'll stop with the concession jokes now)

[attached screenshot]

Long Version: Evolving the CLS metric

Short Version:
CLS is the Cumulative Layout Shift. It's when CSS loads, fonts switch, images load, ads appear, or whatever else causes some portion of the page below the changing element to jarringly drop down by 10px, 100px, 500px, etc.

It's not that big of a problem in reality. Pages load and then they stop shifting. Google wants no shifting at all, which is what makes this concession even stupider because it's going to forgive sites that keep shifting and shifting while doing crap like auto-reloading different ad sizes into the same reserved spot, or making image sliders continue to slide with varying heights.

They're calling a "session window" a period in time where the layout shifts; if there's a gap of X seconds between shifts, a new session window starts.

What they were doing and testing was taking all the shifts in a session window and creating an average score for that window. Then they were averaging all of the session windows. Here's where it varied:
  • Average Scores for unlimited number of larger session windows (5 seconds and beyond) with up to 5 second gaps between.
  • Maximum Scores for smaller session windows capped at 5 total seconds each and 1 second gaps between.
The first option was broken because improving or even eliminating tiny session windows would make your score worse. For instance, if you had 3 session windows with scores of 35, 45, and 1, your total score is 81 and your average score is 27. If you fix that session window with a 1 score, your total score is better at 80 but your average score is now 40 so you get punished by the algorithm even though you improved.
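That arithmetic is easy to sanity-check. A quick sketch using the hypothetical window scores from the example above (35, 45, and 1):

```python
def average_score(windows):
    """Average the per-window scores -- the broken first option."""
    return sum(windows) / len(windows)

before = [35, 45, 1]   # three session windows, one of them tiny
after = [35, 45]       # the tiny 1-point window has been fixed away

print(sum(before), average_score(before))  # 81 27.0
print(sum(after), average_score(after))    # 80 40.0
```

Eliminating the tiny window lowers the total (81 to 80) but raises the average (27 to 40), which is exactly the perverse incentive that got this option scrapped.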

So they're going to run with the 2nd option, which makes way more sense to make the maximum score the meaningful metric. Capping off the time length of session windows and the gaps... who cares. It's setting them to longer periods of time with shorter gaps in between that's funky.

The example they give is that of a page with a continual stream of layout shifts that occur within 5 seconds of each other, over and over. Google says these pages are annoying but they don't become extra annoying over time. They just stay the same amount of annoying, so there's no need to hammer them into the ground.

On the flip side, if the session window length is too short, pages with really slow page speeds can end up being rewarded by having their layout shifts broken across extra session windows.

So the point is to think of a session window as a "batch of layout shifting" that gets one score. With the maximum approach, the worst batch becomes your final score. Session windows = batches of layout shifting grouped together in time.
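That batching logic can be sketched in a few lines. This is my own illustration, not Google's actual implementation: it assumes shifts arrive as (timestamp, score) pairs, uses a 1-second gap and a 5-second cap per window, and reports the worst window.

```python
def cls_from_shifts(shifts, max_gap=1.0, max_window=5.0):
    """Group layout shifts into session windows, score each window as
    the sum of its shifts, and return the worst (maximum) window."""
    windows = []
    current, start, last = 0.0, None, None
    for t, score in sorted(shifts):
        new_window = (
            start is None
            or t - last > max_gap       # gap too long: start a new window
            or t - start > max_window   # window hit the 5-second cap
        )
        if new_window:
            if start is not None:
                windows.append(current)
            current, start = 0.0, t
        current += score
        last = t
    if start is not None:
        windows.append(current)
    return max(windows, default=0.0)

# Two bursts of shifting, ~3 seconds apart: two windows, worst one wins.
shifts = [(0.0, 0.5), (0.5, 0.25), (4.0, 0.1)]
print(cls_from_shifts(shifts))  # 0.75 (the first window: 0.5 + 0.25)
```

The key property is that a page that keeps shifting forever maxes out at its worst burst instead of accumulating an ever-growing total.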

Anyways, the results of this change are...
  • no page will have a worse score due to this change
  • 55% of pages won't see a change in CLS at the 75th percentile
  • most others at the 75th percentile will see only a slight improvement
  • about 3% of them will see their score going from "needs improvement" or "poor" to "good"
  • Those 3% are mainly pages with infinite scrolling or continual shifting that never stops
Let me bust out my trusty calculator. If 55% in the 75th percentile won't see a change, that means 45% will with 3% being significant. I hope I'm doing this right... to bring this into big picture numbers:
  • 31.5% of all pages will see a slight improvement
  • 2.25% of all pages will see a significant improvement
  • 66.25% (the rest of the 75th percentile + everyone else) won't see any negative or positive effect
Yay for 1/3rd of all web pages.
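Checking that arithmetic (under the reading above that the stated percentages apply to 75% of pages; that interpretation is the poster's, not Google's):

```python
in_bucket = 75      # percent of pages the stats are read against
unchanged = 55      # "won't see a change"
significant = 3     # "needs improvement"/"poor" -> "good"
slight = 100 - unchanged - significant   # 42: slight improvement only

print(slight * in_bucket / 100)        # 31.5  -> slight improvement
print(significant * in_bucket / 100)   # 2.25  -> significant improvement
print(100 - (slight + significant) * in_bucket / 100)  # 66.25 -> no change
```

The three buckets (31.5 + 2.25 + 66.25) add back up to 100%, so the numbers at least hang together internally.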
 
In my case, a few things did the trick:
  • Using srcset with a plugin that automatically creates various sizes for images.
  • Combining all the CSS and JS together (separately)
  • Storing assets on a CDN (in our case, CloudFlare)
[attached screenshot]

Images were the main blocking factor for First Contentful Paint, Largest Contentful Paint, and Cumulative Layout Shift. We initially tried to reduce the image size through compression/lower quality with a one-size-fits-all approach, but it didn't work.

Hope this helps.
 

Rollout of Page Experience Algorithm pushed back to mid-June

https://developers.google.com/search/blog/2021/04/more-details-page-experience
  • The rollout will be gradual and finish at the end of August
  • Search Console has a new Page Experience report
Other unrelated stuff included in this update:
  • The Top Stories carousel will now feature any Google News-eligible articles (Google News now auto-includes sites instead of requiring an application)
  • The Top Stories carousel no longer requires AMP, nor does the Google News app
  • The AMP badge will no longer be shown
  • They're expanding pre-fetching to other signed exchanges (SXG) and not just AMP
"Building a better web, together" through compulsion or damage to your income!
 
I hate this shit; it will cost me, since I'll have to update an old theme that isn't at all optimized for this.

This idea that Google has that first world countries run on 3G cellphones is retarded. My site loads in less than 2 seconds, full page, from my cellphone, yet Google wants me to think it loads in 10 seconds.

Absolutely ridiculous. Imagine rolling out a worldwide service and aiming the benchmarks at the Philippines. You'd have to be some kind of equal-outcome marxist to do that. Oh wait.
 
So are they basically opening up AMP's perks to anyone that can meet their standards?

Or am I misreading things?

If so, that's pretty ballsy.

I don't totally hate all of this.
 
I've been trying to get myself to a green level on Google's Core Web Vitals these last few weeks and holy crap is that a shitshow.

If you're running Wordpress, like 60% of the web, you will struggle big time.

I've tried at least 3 different, highly recommended cache/speed plugins and none of them gains more than 10 points or so.

I've upgraded my theme and cut extra CSS, and I'm still sub-40 on mobile.

The truth is that Google has set completely unrealistic, counter-intuitive baselines that will do more harm than good, forcing the internet to become less diverse, less enriched, less embedded, because images, illustrations, and YouTube embeds all do irreparable harm to the speed score.

This attempt at an "equal opportunity" internet is absurd, in which the baseline is a 3G connection in Bangladesh vs high speed 4G and fiber in Northern Europe.

Fact: Unless you are a very technical, very skilled front AND backend developer, you stand no chance of actually getting a decent score.

Fact: It's not a task that 90% of coders, theme developers, or agencies are able to do.

What's worse is that Google's inane insistence on these arbitrary speed numbers has created a new and thriving breed of bullshit artists in "speed optimization" who peddle their "fixes" everywhere.

Here's the fix they do: install a plugin and defer JavaScript = instant orange/green. Then they charge you hundreds if not thousands of dollars. It's a completely bullshit industry, created entirely by Google.

Is that what you want Google? Cause that's what you're getting.
 
The truth is that Google has set completely unrealistic, counter-intuitive baselines that will do more harm than good, forcing the internet to become less diverse, less enriched, less embedded, because images, illustrations, and YouTube embeds all do irreparable harm to the speed score.

[..]

Is that what you want Google?
The reality is it's about fewer spammy ads on pages. Yes, it's less revenue for website owners, but when you go to websites and the experience looks like this:

[attached screenshot]


I had to scroll down to get the answer (StackedMarketer Riddle):

[attached screenshot]


--

At some point the "web" experience has gone haywire, regardless of why Pastebin is trying to squeeze every potential cent out of my visit. You go to news sites and you get auto-playing videos following you around, 2-3 popups (one for cookies), and then if you dare close an ad, another one just pops up. And don't get me started on the websites that love auto-reloading themselves to re-serve the ads.

At some point enough is enough...
 
I don't have those kinds of ads, but I embed YouTube, use a lot of images and illustrations, and try to build informative, well-formatted pages made for users.

And I know they work, because I have very high page value, the best page value, people tell me they've never seen so high page value.
 
people tell me they've never seen so high page value.

I edited my post to add more, but most websites on the internet aren't clean experiences. So here we are - Google thinks that they will be able to get website owners to drop several ad networks, perhaps in favor of Adsense, and "heal" the web experience.

I think Core Web Vitals are a big waste of time for SEOs, period. Just something to keep SEOs busy.
 
My experience is that page speed is definitely a ranking factor and it is not inconsequential.

Sites that load very quickly get a significant boost.
 
Sites that load very quickly get a significant boost.
That's true, but you don't need this Core Web Vitals crap to get a fast-loading site. The old PageSpeed test, and really Pingdom before it, got the job done. I suspect Google isn't going to have a choice and will have to rank slower websites when 99% of the internet cannot meet its requirements.

It's sort of like this FLoC crap. Everyone is dropping it: Twitter removes FLoC support in another blow to Google

Google is over-reaching with their view of the internet, when in reality the internet isn't controlled by any single entity. Their control of Chrome shapes how most users get to the internet, Adsense/Adwords handles which ads you see, and their search dominates as the source of traffic for most users. 99% of users don't care.

BUT, when they try to dictate their tech onto website owners, like with AMP and this over-reach with Core Web Vitals, they get serious resistance, especially when it requires a whole host of people to get along and get the job done. When they try to dictate how websites should behave, it doesn't work out.
 