Google Algorithm Updates - 2024 Ongoing Discussion

https://www.cnbc.com/2024/04/23/goo...warns-employees-of-new-operating-reality.html

A lot of words from Google's second in line.

I must say Google under Pichai and this guy Raghavan, whom I hadn't heard of before, does seem to have taken the company in an entirely different direction. I don't think Page and Brin are involved at all anymore.

The overall feel I get is Google turning into just another bottom-line company, caring more about quarterly numbers than about growing through innovation. The question is whether they're looking to become Microsoft or Yahoo. I feel as if they hugely overestimate the current quality of their search engine. Even Elon Musk has complained publicly now. People are saying Bing is better, and for many searches I honestly agree.

I don't think it's inconceivable that Google could lose maybe 10-20% of its search share to other search engines within the next 5 years.
I agree with you.

Last week it happened twice that I got frustrated with Google (as a user) because the SERP was crap/unreliable. I can't even remember the last time that happened. When I searched, all I got was Forbes and the like, plus some Reddit posts.

I went to Bing for the first time in like 10 years (as a user) and was presented with a website that answered my question instantly.
 
I don't think it's inconceivable that Google could lose maybe 10-20% of its search share to other search engines within the next 5 years.
I switched to Bing a while ago. The results are better / more complete in my opinion. At this point the only thing that stops most people from switching is that they have to get used to a different interface. It will take time, as you're saying, but I believe they'll eventually lose a fair share if they keep going like they have the last 2 years or so.
 
Also on Bing. I do find it a bit more cluttered than Google (site links and features everywhere), but otherwise it's great. Oh, it sucks for shopping, prices, where to buy stuff though.
 
Even Elon Musk has complained publicly now. People are saying Bing is better, and for many searches I honestly agree.

When you Google you should use 'before:2023' and you'll see the difference in results.

Example: Who is Elon Musk?
versus
Who is Elon Musk? before:2023

You guys will thank me later. There are a lot of tricks I picked up and forgot about because of SW.

Google with before:2023 and you'll be fine. :wink:
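
If you want to compare the two side by side without retyping, here's a tiny sketch of my own (nothing official, just the standard google.com/search?q= URL) that opens the query with and without the operator:

```python
# Rough sketch: open the same query with and without the before: operator
# so you can compare the SERPs in two tabs. Only assumption is the standard
# google.com/search?q= endpoint.
import webbrowser
from urllib.parse import urlencode

query = "Who is Elon Musk?"

for q in (query, f"{query} before:2023"):
    url = "https://www.google.com/search?" + urlencode({"q": q})
    webbrowser.open_new_tab(url)  # one tab per variant, compare the results
```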
 
Wow, like 8 sites of mine just crashed today. Even the ones that had been quite ok during this update. I just checked my charts and it's like a downward-facing hockey stick on all of them.

All of them in completely different niches and built in different ways. Some have been untouched for years, others are newer. Some got updated during HCU, others didn't.

But today they all just fell off a cliff haha!

One received an upward hockey stick. It's one of my most spammy-looking websites, with horrible UX and no specific niche. Crazy lol
 
The March 2024 Core Update has finished rolling out.

It started on March 5th, 2024 and took 45 days to complete, supposedly ending on April 19th, 2024. Why we're only hearing about it now is interesting. I'm guessing it didn't quite go the way they wanted, but they had to call it at some point and start tweaking again, which would line up with what @Trump is seeing today.

It ran alongside the March 2024 Spam Update, which started on March 5th and finished on March 20th.
 
I have a theory... something I've been thinking about for a few weeks now... I've hesitated to share it because I haven't had time to flesh it out... but it's Friday and I'm tired so fuck it.

Here it is...

What if Google doesn't even know what's going on?

Not in the "No one at Google understands the algo" way.

But in the "We don't understand why the results are so bad after the update" way.

What if, the algo is operating exactly as it should...

But, instead of delivering "good results" we're getting the current shitshow because the results are being unnaturally skewed due to a massive exploit... an unsophisticated spam attack being carried out by a nation-state targeting domains with US/Western IPs.

Maybe... MAYBE... we really are on the battlefield here boys.



... or maybe we're just a bunch of fucking idiots who didn't keep their backlink profiles clean for the last few years.

In any case, Google sucks. Enjoy your weekend.
 
In the spirit of creating better content, content that can compete with AI... I believe the format that AXIOS is using is pretty sweet/solid.

Concise and easy to skim. Meanwhile, the NYT is still using walls of text.

Anyway, that's how I've been structuring each of my H2 sections.

---

Here is a business idea for all of you that are still wanting to stay in the info game.

Find a local business in your niche and offer FREE LOCAL SEO services to them in exchange for free rein over their "BLOG" section.

28% of small businesses don't even have a website, and a good chunk of the ones that do have a single page that's still stuck in the 1990s.
 
Here is a business idea for all of you that are still wanting to stay in the info game.

Find a local business in your niche and offer FREE LOCAL SEO services to them in exchange for free rein over their "BLOG" section.

28% of small businesses don't even have a website, and a good chunk of the ones that do have a single page that's still stuck in the 1990s.
I've been considering this and also taking it a step further with actual partnerships - basically run the Marketing/Business side of things while they focus on the trade/craft. A lot to unpack there.

Local SEO seems to avoid a lot of the big algo update swings...last of the Mohicans?
 
I've been considering this and also taking it a step further with actual partnerships - basically run the Marketing/Business side of things while they focus on the trade/craft. A lot to unpack there.

Local SEO seems to avoid a lot of the big algo update swings...last of the Mohicans?
Yes, that's what I'm talking about when I say "free local seo services".

All incentives and interests align.

The better they rank locally, the more authoritative the info portion of the blog becomes.

You guys ever notice how some of these local biz sites will rank on the first page and you look at their backlink profile and they have a grand total of 5-15 domains only?

Imagine what you can do if you juice it up with a couple of 80+ DR HARO links.
 
Ok, here is something else I've been thinking about. I was digging into the sites of this media company: https://pillarfour.com/brands

Most of their sites that are structured like niche sites/affiliate/content types are falling.

This one here is a big winner https://www.garagegymreviews.com/equipment , but check its structure, especially the link I posted. It even uses WooCommerce, but in catalog mode.

So, I am wondering if there is a structural problem. A lot of the sites that were not affected were local and e-commerce. The majority that got hit were content/niche sites.

Could it be that the raters fed enough bad sites with this structure to Google's AI, and it now treats this structure as an unhelpful factor?

I have noticed quite a few other examples of niche sites that survived, and they don't have the typical niche site structure.
Does anybody else have observations in this direction?
 
So, I am wondering if there is a structural problem. A lot of the sites that were not affected were local and e-commerce. The majority that got hit were content/niche sites.
You’ve hit the nail on the head.

Google says this update was an ‘evolution in the way they identify helpful content’.

The reality is these updates have absolutely nothing to do with the ‘helpfulness’ of your content and everything to do with the platform your website is built on and your method of monetisation.

Here’s Catster and Dogster, publishers that introduced ecommerce functionality just before the update:

https://imgur.com/ikCYg0T

https://imgur.com/Fd3QQ8Z

(Note to mod: when I try to insert an image I get the error message 'Image cannot be loaded from the passed link')

Here’s Plantura UK, a German ecommerce retailer that carried out the ‘SEO Heist’ model over the last few years to grow their UK traffic:

https://imgur.com/VqK09Lv

This is a German brand that looks as though they’ve never even set foot in a British garden, but they’re suddenly one of the largest gardening publishers by Google traffic in the UK?

This is why there’s an ever-growing cohort of publishers angry at the gaslighting by Google and their flying monkeys on X. You’re told your content isn’t ‘helpful’ enough, but it has nothing to do with content quality or EEAT – the differentiators that move the dial are arbitrary and ultimately based on Google’s corporate interest rather than any legitimate attempt to improve the quality of the search results.

These updates are purpose-built to be punitive to publishers. 97% of the 1,500 sites that dropped the most in visibility since the first HCU have ads:


In public this is Google’s sticking plaster for de-incentivising the proliferation of AI content. But you also have to look at this through the lens of Google’s self-interest:
  • Dissatisfaction with the organic results = more ad clicks
  • Funnelling traffic to retailers helps to boost the balance sheet of the same businesses who will reinvest in Google ads
  • Driving more traffic to Reddit helps them train their AI models on new UGC
  • Makes for a convenient excuse to stifle and kill publishers before stealing their content through SGE
Publishing is no longer a viable business model for small and medium-sized websites while Google retains a stranglehold on the open web. I feel sorry for webmasters on X when I see they are still investing time and money into making their website more 'helpful' to Google.

Until the FTC / DOJ / CMA steps in this is the new reality – with every update they’re dialling up the signals. If you haven’t diversified away from display ad / affiliate monetisation through Google then prepare for your business to die.
 
The reality is these updates have absolutely nothing to do with the ‘helpfulness’ of your content and everything to do with the platform your website is built on and your method of monetisation. If you haven’t diversified away from display ad / affiliate monetisation through Google then prepare for your business to die.
So add a shop onto the domain and wait for reclassification?
 
Could it be that the raters fed enough bad sites with this structure to Google's AI, and it now treats this structure as an unhelpful factor?

I have noticed quite a few other examples of niche sites that survived, and they don't have the typical niche site structure.
Does anybody else have observations in this direction?

This is probably the main reason.

Google is running on machine learning based on their manual rater feedback. You just feed the machine learning algo 10,000 bad niche sites and it figures out some common factors.

Keep in mind that these common factors don't have to be on-page, they can also be off-page or user metrics, because Google has access to all those.
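
To illustrate the idea (and only the idea, this obviously isn't Google's actual pipeline), here's a toy version: take a handful of sites labelled the way the raters would label them, throw some site-level features at a classifier, and let it tell you which traits the "unhelpful" ones share. The feature names here are invented for the example:

```python
# Toy illustration of "feed the model rater-labelled sites and let it find the
# common factors". Feature names are hypothetical; in reality they could be
# on-page, off-page or user metrics.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

features = ["ads_per_page", "pct_keyword_titles", "referring_domains", "avg_dwell_seconds"]

X = np.array([
    [6, 0.95,  12,  35],   # rater verdict: unhelpful niche site
    [5, 0.90,  20,  40],   # unhelpful
    [1, 0.20, 450, 210],   # helpful brand / ecom site
    [0, 0.10, 900, 180],   # helpful
    # ...in reality, thousands of labelled sites...
])
y = np.array([1, 1, 0, 0])  # 1 = unhelpful, 0 = helpful (rater labels)

model = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Which shared traits did the model latch onto?
for name, importance in zip(features, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```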

So add a shop onto the domain and wait for reclassification?

To me, this just seems like an attempt at delaying the inevitable, but what do I know.

I'd personally rather figure out a way to add value, so that you have something people want outside of Google. Something that will get you shared, linked, and so on. A tool, a registry, real-life involvement, something that will have real people talking about it.
 
I imagine it's a mix of signals that signify you're an ecommerce retailer, but ultimately it's about convincing the algorithm that your primary method of monetisation isn't display advertising / affiliates. As George mentions above, and as I read somewhere else (probably on this forum), burying the blog deep within the site structure and using the homepage / main nav for a 'real business' purpose such as ecom / product / service is likely a good idea.
 
A lot of the sites that were not affected were local and e-commerce. The majority that got hit were content/niche sites.

Could it be that the raters fed enough bad sites with this structure to Google's AI, and it now treats this structure as an unhelpful factor?

Is anyone else amazed at the degree of ACCURACY with which the HCU struck, with very little collateral damage (local sites, e-com, e-com with blogs, news sites, etc. are all fine)?

I've been thinking a lot about this, especially from an implementation standpoint, and I have a theory.

My Theory: The HCU targeted websites with an "unnatural" amount of keyword-focused articles. It has nothing to do with content quality itself.

Why?: "Niche site" creators/mini publishers are the most likely people to abuse AI content, and these people are heavy on keyword research and optimization. It makes sense to wipe out/discourage this set of people to reduce (temporarily?) the volume of AI content Google has to deal with. This probably also aligns with whatever agenda (SGE, Reddit IPO, etc.) they have going on.

Implementation: Implementation was likely content-based, since Google already has the total index count for each website in their database.

Implementation probably looked like this:
  • Use QRG feedback to create machine learning algos that can recognise optimization patterns for sites that typically chase keywords. E.g. (i) most niche sites have strictly keyword-rich titles; (ii) most niche sites have the same H1 and H2.
  • Create a threshold for informational-query over-optimization.
  • Evaluate each website's index in the database against the optimization threshold.
  • If the index exceeds the threshold, classify the site as "unhelpful", demote by X%, and keep demoting by X% every Y days until the index goes below the threshold, then remove the classifier.
Let's take thatfitfriend.com for example:
  • Total pages: 656 indexed pages
  • Query-focused pages: 600
  • Non-query-focused pages: 56
  • % of query-focused pages: ~91.5%
  • Google HCU: over 91% of your pages/website/content is targeting an informational query. That's unhelpful. Demote!
Only "niche sites" with ads and aff links would have this level of "unnatural" index.

Why content-based evaluation and not structural?: Because it is likely the best way to hit the nail on the head with minimal collateral damage. Structural evaluation is too risky to do on a large scale (the entire web) without heavy consequences.

I think most of our structural observations are just symptoms.

Why could this be true?: The degree of accuracy here is quite good when you consider that the HCU targeted "content". Content websites in this case would also include news publishers, since news is also "content", yet niche sites are the only ones that got absolutely decimated. I think this might be because most news content is "in the moment" and not heavily optimized. So, the news content significantly lowers the optimization ratio/index for news sites.

Why no recovery so far?: This is where it gets interesting..

What if we've not been accurate with our recovery approach?

Most people have been rewriting content, deleting content, removing ads, etc., which can actually help reduce the optimization ratio for your existing index. But what if the solution is in 2 parts: reduce optimization AND massively increase your overall index count by adding "unoptimized" pages?

Maybe Google wants you to show them 2 things: (i) that you're repentant (i.e. remove optimized content), and (ii) that you're committed to improving your website's growing index (i.e. the future pages/content added to your index will be "unoptimized").


A lot of people have done part 1 but not part 2.

This is where ideas like adding a shop, product pages, etc. start to make sense, because adding them is a quick way to significantly increase your index count (and you get the extra benefit of looking like a real business).
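
Running the same made-up numbers the other way shows why part 2 matters: under these assumptions, de-optimizing alone barely moves the ratio, but padding the index with unoptimized pages moves it fast.

```python
# Same hypothetical ratio, worked backwards: how many unoptimized pages would
# the example site need to add to get under an assumed 80% threshold?
query_pages, total_pages, threshold = 600, 656, 0.80

# Solve query_pages / (total_pages + new_pages) < threshold for new_pages.
new_pages_needed = query_pages / threshold - total_pages
print(round(new_pages_needed))  # ~94 unoptimized pages (shop, product, news, etc.)
```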

I remember there was a time Lily Ray tweeted about adding /shop/ and a Google spokesperson quickly responded to downplay the idea (as usual), then she took back her words. I felt she was on to something here.

It's either this, or Google didn't set up the HCU for recovery and only they have the power to remove the classifier... OR maybe they've lost control of the algo. LOL.

I'd love to hear what you guys think about this.
 
Maybe local sites generally don't get scrapers'/spammers' attention, whereas sites going after national and international terms will. Check the backlink profiles of the sites hit above to verify.

You guys are making it more complicated than it is.
 
^^^^
Ding ding ding ding winner.

It's fundamentally the same as 2014. Filters, index updates, blacklists, whitelists.
Those just got upgraded a level in March.
Humans generalize content. Machine-learning-based language models can too, but not in a way you can build practical databases from.

This is just Google trying to retake control of their SERPs, but somehow walking backwards and defaulting to including fewer players in the ecosystem to try and control spam.
 
an "unnatural" amount of keyword-focused articles
I think you're on to something here. Maybe not in terms of recovery, per se, but in terms of site classification or footprints, if you will.

I hadn't thought about this much recently, but I think it's likely a big factor in determining what a site's purpose is - i.e., is this an "SEO site"?

And if you look at how Google has always acted, they're antagonistic towards SEOs - white hat, black hat, retard hat - at the end of the day we're all trying to manipulate their search rankings.

Despite their intelligence-agency-grade propaganda arms positioning themselves as if they're "helping" people rank in their search engines, look at how they act rather than what they say - they're always showing us "WE DON'T LIKE SEOS."

So, I say all that to say: I like your idea... it fits with my current "don't act/think like an SEO" ethos, and it's something I'm going to try to do on all of my sites moving forward.

Really, I think this issue is probably a major giveaway. This makes more sense to me than a structural thing, but it could also be structural, who knows...

massively increase your overall index count by adding "unoptimized" pages?
The best performing sites lately are always FULL of unoptimized pages. Pages about the site topic, but completely unoptimized towards a keyword.

Of course, they all also usually sell something and don't just offer up content for content's sake, except major media publications and news sites like you said. But those publications do have super active social media and don't for a minute rely solely on SEO traffic or SEO content.

They are not at all "SEO Sites."

Most of us have... SEO sites.

It'd probably be a good idea to do some topic research and completely ignore the "keyword research" part of finding content ideas - brainstorming things to talk about related to your topic, even if there's no search volume or search intent behind it.

because most news content is "in the moment" and not heavily optimized

Bingo, I remember thinking of re-adding a news section to some of my sites but haven't done it yet. I still think this is the way, though.

Padding out your content with stuff that doesn't have a keyword focus sounds like a great idea.

Final thought... I think most recovery attempts are futile or wasted time/resources unless the site was a 5+ figure monthly moneymaker. It would make more sense for most others to start fresh - brand new (with an expired domain) - with a tight set of highly keyword-optimized pages that will make you money and about 5-10x more non-keyword-focused pages about the topic in general. Be that news, e-comm pages, product/service-led pages, etc.

Edit: Also, I think recovery is definitely more links-based (scraper sites, anchors, image hotlinks) than content-based, but it could be a combination of both.
 
Seriously... this has nothing to do with over-optimization and you're not going to WIN by de-optimizing.

When G says "search engine first" content, it's their code word for "INFO + AFFILIATE sites will get whacked". Most likely because they compete with G for revenue.

AD density
Optimization
EEAT
Helpful content

Those rules don't apply to sites that are NOT categorized as info/affiliate.

---

Whacking every info/affiliate site on this forum won't even make G bat an eyelash.

However, you know what is unacceptable? A new store/biz that opened up down the block from you and doesn't exist on Maps or in the SERPs. Now THAT is a catastrophic failure for G as a search engine.

Local biz get a loooooooooot more leniency... just think about it. The path forward is to exploit local.
 
I agree with certain aspects of what everyone is saying... but I also know of clear exceptions to every example of "this is what Google is doing" given here and on every other platform from X and Reddit to BHW... ranging from theories about EEAT to engagement, and topical to longtail, and every idea in between.

I know of pure info sites that are absolutely killing it with not a single author listed, affiliate sites that are still ranking, e-commerce sites that are getting slapped, and other sites that should be doing well (according to what everyone thinks) that are getting smashed.

And... when you consider that the Google algo is an algorithm... the discrepancies in the SERPs do not add up. The calculus is off.

That said, there is one message resonating with me... that I have not yet and may never disprove.

@CCarter and @secretagentdad mentioned it above... it's the strategy outlined by @Grind that is currently making waves on BuSo over in the Massive Links thread.

Yea, I know I'm at risk of sounding like an evangelist... but given that I can disprove everyone's explanation of why sites are failing with clear examples of sites still killing it, I'm leaning toward the possibility that massive spam attacks are at the source of most people's woes.

This is only strengthened by the examples shared here and elsewhere, with people seeing the attacks and seeing the influx of massive amounts of links with /1000 and //1000 and not doing anything about it because Google says it's cool, you can relax...

And then, surprisingly, these sites with thousands of spam links are getting torched to hell... but it's because they're missing an author box... or an address... or they use the most popular CMS in the world...? Feels off. The numbers don't add up.
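
For anyone who wants to actually look instead of guessing, here's a quick-and-dirty sketch for pulling the /1000-style spam links out of a backlinks export and turning them into a disavow file. The CSV filename and column name are assumptions, adjust them to whatever Ahrefs/Semrush/GSC gives you; the domain: line format is what Google's disavow tool expects.

```python
# Rough sketch: flag referring domains whose links contain the "/1000" pattern
# and write them into a disavow file. Filename and column name are assumptions
# about your export; adjust to your tool.
import csv
from urllib.parse import urlparse

suspicious = set()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["source_url"]          # hypothetical column name
        if "/1000" in url:               # also catches the "//1000" variant
            suspicious.add(urlparse(url).netloc)

with open("disavow.txt", "w", encoding="utf-8") as out:
    for domain in sorted(suspicious):
        out.write(f"domain:{domain}\n")  # format Google's disavow tool accepts

print(f"{len(suspicious)} domains flagged")
```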

At the end of the day, I'm hunting for the solution to a math equation... and right now, I feel like we're getting close.
 