Google Algorithm Updates - 2022 Ongoing Discussion

Status
Not open for further replies.
I got hit with about a 25% traffic loss on one site, but my earnings dropped a lot less than that, which was cool. It wasn't so much losing rankings or dropping a few spots... it was certain posts just disappearing from the SERPs altogether.

The bleeding has stopped and we're back to growing, even though the lost stuff hasn't returned yet.

What's-his-name, some guy on a forum, was saying from the very start that this looks like a bit of an oopsie on Google's part, or that they overshot. So if the old posts do eventually bounce back, we'll be in much better shape than before the update even happened, since I'm doing the bathroom sink method and all that, too.


Edit: I'm referring to the June update.
 
Semrush today: 5.1/10 (high range) - position changes for numerous sites!

Anyone seeing any changes? I had 2 small sites I never touch lose 85%. It was only 100 visitors a day on each, so it didn't matter, but interesting to see - both affiliate sites using the Affiliate Lab method.

Small decline here - a couple of percentage points.
 
I believe this was originally announced as a "2-3 week rollout" on July 27th, but it looks like it finished early? Maybe they just say 2-3 weeks by default, based on previous PRUs? Or maaaaaaaaybe it was fucked and they knew it? (please god let it be the last one)

 
There is a massive update AND an immediate roll-back happening RIGHT NOW!

[screenshot: SERPWoo global volatility stats]


I'm spot checking and domains dropped from yesterday to today and then in real-time the domain is BACK! This looks like they excluded some domains accidentally.
 
There is a massive update AND an immediate roll-back happening RIGHT NOW!
Google had an electrical fire at a data center yesterday, and tons of URLs dropped out of the index (but are supposedly being restored). That's supposedly what's causing the giant shifts. It could be another cover story for indexing problems, but they said three technicians were injured, and Google was offline for a lot of people last night for maybe 10 minutes, so it seems true.
 
This algo update, or fuckup, is a major win for me. Up, up, up, big time across several sites.

Very weird, but I hope it sticks.

Edit: I have major gains on all sites, except a new site.

I do have some pages deindexed, but if this sticks, it's probably the single best update for me ever. It probably won't, though. If I've moved up because of the deindexing, it's weird that none of my sites are affected. AI content overreach?
 
fire at a data center
Wow. That's interesting. People need to study this, because when I see 33%+ movement in domains/URLs in the Global Stats - more than double the 15% severity level - it means either WE (SERPWoo) fucked up or Google made an HTML update we aren't interpreting. But we were still collecting the same amount of data, so that wasn't the case.

This is the first time I've ever seen anything like this at any level in 8+ years of closely monitoring SERPs. This was a crazy event.

So apparently when a Google data center goes offline it can impact multiple SERPs worldwide, meaning there are single points of failure in the algorithm.

It also means this is a live algorithm that pulls from multiple data centers, and it will drop a domain if the data center storing that domain's data goes offline.

 
It also means this is a live algorithm that pulls from multiple data centers, and it will drop a domain if the data center storing that domain's data goes offline.
I'm no IT infrastructure engineer, but wouldn't a company of this size and with this many resources try to compartmentalize things a little bit? Human error, natural disasters, supply chain and logistics disruptions... seems kind of ballsy to have an algo that requires 100% uptime across all 14 (U.S.) data centers?
 
I'm no IT infrastructure engineer, but wouldn't a company of this size and with this many resources try to compartmentalize things a little bit? Human error, natural disasters, supply chain and logistics disruptions... seems kind of ballsy to have an algo that requires 100% uptime across all 14 (U.S.) data centers?
I'd say it had more to do with data propagation. I'm sure they shard all the data they need across several data centers - think RAID, a redundant array of independent disks, except the "disks" are data centers. I'm sure everything has redundancies within one data center, and critical info in each data center is backed up in another, because no data is safe unless it's in 3 separate locations (not just separate hard drives or servers).

But I'm also guessing they deploy their spiders and push what they find across all the data centers and start using that data nearly immediately. And I'm guessing that data is then scheduled or trickled back to other systems in a live stream of bandwidth, to have calculations run on it and get it all "settled" into the live algorithm.

So I can see why some information would drop out if an entire chunk of a data center was dislodged from the system for a while. You can imagine that if X amount of time passes before it comes back online then contingency operations kick in to start feeding "settled" data back in and to recalculate whatever the "live" data was influencing. It wouldn't take much data loss, whether settled and/or live, to shuffle a ton of SERPs around. It wouldn't take much shuffling, relatively speaking, to set off the alarms for all the "weather trackers".

I'm guessing that's how it would all work, but what do I know. I'm just assuming they have redundancy and contingency plans in place to restore data ASAP, which is what we're seeing play out. Then again, it seems like everyone takes backups and never confirms the backups "took" or were backed up correctly. We've seen venture-funded companies delete their entire operations and all client data, find out the backups never worked from day one, and only survive because someone happened to take a backup the day before by pure serendipity. But Google is way bigger than any of that.
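None of us know Google's internals, but the redundancy guess above can be sketched in a few lines: shard the ranking data, give each shard a primary data center plus backup copies elsewhere, and fall back to a replica when the primary is down. Everything here (the data center names, the shard structure, the metrics) is invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Shard:
    primary: str    # data center holding the live copy (hypothetical)
    replicas: list  # backup data centers holding copies
    data: dict = field(default_factory=dict)

def lookup(shard, domain, online):
    """Return stored metrics for `domain`, falling back to replica data
    centers when the primary is offline. Returns None only when every
    center holding the shard is dark - the case where a domain simply
    vanishes from the SERPs until a backup comes online elsewhere."""
    for dc in [shard.primary] + shard.replicas:
        if dc in online:
            return shard.data.get(domain)
    return None  # all copies unreachable: domain "disappears"

shard = Shard(primary="iowa",
              replicas=["oregon", "south-carolina"],
              data={"50cent.com": {"rank": 3}})

print(lookup(shard, "50cent.com", {"iowa", "oregon"}))  # primary answers
print(lookup(shard, "50cent.com", {"oregon"}))          # replica answers
print(lookup(shard, "50cent.com", {"nevada"}))          # None: domain drops
```

The point of the toy: losing one data center shouldn't lose any data, but if the fallback path is slow to kick in (or a shard's copies are co-located), you get exactly the temporary "domain gone, then back" behavior described in this thread.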
 
compartmentalize things a little bit

I'll say it like this: WE (SERPWoo) cache several control keywords so we can visualize what the SERPs look like in cases where we don't catch everything.

One of those keywords is "50 cents" - a celebrity whose SERPs usually include several rich media elements like a scrolling Twitter feed, images, People Also Ask, and more. So if something happens, we have a set of control keywords that we manually check against the cache as we update the crawling library for those keywords. Google likes changing HTML/CSS a lot.

This morning (8:45 AM EST) 50cent.com completely disappeared from the "50 cents" SERPs. Normally that would mean Google updated the HTML/CSS classes and the domain is still there; we just didn't catch it.

[screenshot: "50 cents" SERP with 50cent.com missing]


So we use the saved cache to figure out what to update. Well, we looked at the saved cache and that domain was completely gone. That's very odd.

There are certain domains that are just supposed to be there. An example is IMDb.com for actors and actresses - they are going to be there no matter what. Whether we parse them correctly or not is on us. This morning 50cent.com wasn't there, but it was there yesterday (last 90 days screenshot):

[screenshot: last 90 days of the "50 cents" SERP]


So obviously I did a real-time lookup, because the cache wasn't showing it, and the domain was back:

[screenshot: real-time lookup with 50cent.com back]


--

That was either an immediate roll-back OR a propagation move to a different data center from the Iowa backups. But that would mean 50cent.com's data literally lived in the Iowa data center and couldn't be reached, so it was dropped until the backups came online somewhere else and restored its data/metrics.

There are just certain control domains that have to be there, like IMDb or a person's personal website, and if they drop it's either a manual removal or something else going on.
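The spot-check described above boils down to a simple routine (a hypothetical sketch - the real SERPWoo pipeline is obviously more involved): for each control keyword there's a domain that should always rank. If it's missing from the fresh crawl, check the saved cache: if it's still in the cache, our parser broke (Google changed the HTML/CSS); if it's gone from the cache too, the SERP itself actually changed.

```python
def diagnose(control_domain, crawled_domains, cached_domains):
    """Classify why a must-rank control domain is missing.
    Hypothetical sketch of the control-keyword spot-check."""
    if control_domain in crawled_domains:
        return "ok"             # domain is where it should be
    if control_domain in cached_domains:
        return "parser-broken"  # cache has it: Google likely changed HTML/CSS
    return "serp-changed"       # gone from the cache too: Google dropped it

# The "50 cents" anecdote: 50cent.com missing from the fresh crawl
# AND from the saved cache - so it's a real SERP change, not a parsing bug.
cached = {"imdb.com", "wikipedia.org"}
live = {"imdb.com", "wikipedia.org"}
print(diagnose("50cent.com", live, cached))  # serp-changed
```

The useful property of the check is that the cache acts as a tiebreaker: parser bugs and genuine ranking changes look identical in the live crawl, but only one of them shows up in yesterday's cached copy.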

If a domain is there 89 out of 90 days, it has survived 3-4 Google algorithm updates in that window. Those domains generally don't drop unless something happens.

Now if it was just this one domain, that wouldn't be a problem. But for the SERPs to jump to 35%+ volatility means 35% of all top-10 domains (3.5 out of 10) dropped for almost every keyword - across millions of keywords. That's insane.

[screenshot: SERPWoo global volatility stats]


That just doesn't happen unless something fucked up - Google or us. This one is all on Google - and they did an immediate real-time fix, which I've never seen happen in 8+ years of monitoring millions of keywords in the SERPs.

The one interesting thing I can add is that there was a significant delay in the propagation from Iowa to the backup location. Interesting stuff.
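For what it's worth, the 35% figure above is just set arithmetic: compare yesterday's top 10 to today's for each keyword and count how many domains vanished. A toy version (the domain lists are made up; this is not SERPWoo's actual formula):

```python
def top10_volatility(yesterday, today):
    """Fraction of yesterday's top-10 domains missing from today's
    top 10. Toy version of a SERP 'weather' metric."""
    dropped = [d for d in yesterday if d not in today]
    return len(dropped) / len(yesterday)

# Invented example: 4 of yesterday's top 10 gone overnight.
yesterday = ["50cent.com", "imdb.com", "wikipedia.org", "facebook.com",
             "twitter.com", "youtube.com", "billboard.com", "instagram.com",
             "rollingstone.com", "forbes.com"]
today = ["imdb.com", "twitter.com", "youtube.com", "billboard.com",
         "instagram.com", "rollingstone.com", "news1.com", "news2.com",
         "news3.com", "news4.com"]

print(top10_volatility(yesterday, today))  # 0.4
```

Average that ratio across millions of tracked keywords and you get the global number; normal days sit well under 15%, which is why a 33-35% reading looks like either a tracker bug or a Google-side event.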
 
Looks like it's back to usual. I enjoyed a few hours of success, but that's it.
LMFAO, oh my god. That was funny. So the only way for you to succeed is for your competition to literally be deleted? That doesn't sound too promising...

Fucking SEO, what a waste of time...
 
LMFAO, oh my god. That was funny. So the only way for you to succeed is for your competition to literally be deleted? That doesn't sound too promising...

Fucking SEO, what a waste of time...

Yes, that's what I do: sit around hoping to win by half the field going down with stomach flu.

Fortunately, I make almost half my paltry earnings on PPC now, unfortunately also from Google.
 
As expected today's volatility is seeing the "return" back to the original rankings now that Google has "fixed the fire":

[screenshot: volatility stats returning to normal]

In the "50 cents" SERPs the domains/URLs ranking from Monday are now back today (Focus Date):

[screenshot: "50 cents" SERP restored to Monday's rankings]

Google is healing...

Even Wikipedia and Facebook.com were missing, jesus...
 
Two pieces of news (@bernard beat me by like 1 second):

August 2022 Product Reviews Update
Google will be releasing an August 2022 Product Reviews Update in the coming weeks.
Lots of tweaking must be going on for so many of these (this is the 5th) to be released.

Helpful Content Update
And perhaps more interesting is that Google will be rolling out a "Helpful Content Update". The summary is that they intend to target websites that are creating content aimed at search engines first and users second (or not at all). Sounds like AI content to me.

This Helpful Content Update will begin rolling out next week and will take 2 weeks to finish. It will be for English language searches globally and come to other languages in the future.

It targets sites sitewide, not individual pages. So if 9 out of 10 pages are unhelpful and 1 is helpful, that 1 helpful page is doomed too.

Let's hope this just targets crappy AI sites, content aggregator sites that add nothing to the discussion, and that real eCommerce sites with long content blurbs just for SEO survive. And that there aren't a ton of civilian casualties again.

Glenn Gabe has posted an article summarizing everything he's found, including from a phone call with Google's Danny Sullivan; you can read it here. I'm going to paste some of the questions Google has added to the list we should ask ourselves.

If you answer 'yes' to these, you're on the right track:
  • Do you have an existing or intended audience for your business or site that would find the content useful if they came directly to you?
  • Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?
  • Does your site have a primary purpose or focus?
  • After reading your content, will someone leave feeling they’ve learned enough about a topic to help achieve their goal?
  • Will someone reading your content leave feeling like they’ve had a satisfying experience?
  • Are you keeping in mind our guidance for core updates and for product reviews?
If you answer 'yes' to these, you're in trouble:
  • Is the content primarily to attract people from search engines, rather than made for humans?
  • Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
  • Are you using extensive automation to produce content on many topics?
  • Are you mainly summarizing what others have to say without adding much value?
  • Are you writing about things simply because they seem trendy and not because you’d write about them otherwise for your existing audience?
  • Does your content leave readers feeling like they need to search again to get better information from other sources?
  • Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
  • Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
  • Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?
 
This means Forbes will gain more market share and you will lose.
I mean, if they really mean it when they say content made for search engines vs. users, the sheer number of auto-playing pop-ups, unders, and overs on Forbes would seem to indicate that content is indeed targeting users over search engines, albeit in an infuriating way.
 
I'm around much of the Twitter-sphere nowadays. Looking forward to the number of goons that are praising this update now and will be complaining in September about how they did everything 'right' and still got slammed.
 
Fingers crossed this update primarily targets AI-generated content. A couple of things they mentioned seem to point in that direction.

- “Are you using extensive automation to produce content

- “a broader effort to ensure people see more original, helpful content written by people
 
I mean, if they really mean it when they say content made for search engines vs. users, the sheer number of auto-playing pop-ups, unders, and overs on Forbes would seem to indicate that content is indeed targeting users over search engines, albeit in an infuriating way.

I clicked an actual Forbes search result recently and it was complete mayhem of clicking things away with ads everywhere. Very bad user experience. I don't understand how Google can let them dominate the SERPs like that just because they were a big magazine 20 years ago.
 
Yeah, it's almost impossible on mobile to accurately exit out of all of those. I never click on Forbes anymore, so yeah, they'll probably hockey-stick next week.
 
If you answer 'yes' to these, you're in trouble:
  • Is the content primarily to attract people from search engines, rather than made for humans?
  • Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
  • Are you using extensive automation to produce content on many topics?
  • Are you mainly summarizing what others have to say without adding much value?
  • Are you writing about things simply because they seem trendy and not because you’d write about them otherwise for your existing audience?
  • Does your content leave readers feeling like they need to search again to get better information from other sources?
  • Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
  • Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
  • Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?
Maybe I'm misunderstanding things but wouldn't that mean that sites like G2 and Gartner (just search for things like 'Zoom alternatives') are severely fucked as well?

Looking at the bullet points: #1, #2, and #5 would probably fuck over 90 percent of all site owners and direct that traffic to news sites like the previously mentioned Forbes.
 