Google Algorithm Updates - 2020 Ongoing Discussion

Status
Not open for further replies.
Two of the sites that jumped up in April 2020 are back to their original positions. Similarly, two sites that went down around the same time have bumped up. Typical core update behavior so far!
 
RED EVERYWHERE

Looks like a pretty big update, just big enough to send panic through marketing departments and get people on the phone with their AdWords reps.
 
Yeah, for me, I started seeing small hints of it two days ago, but today was big.

Site 1: 50% drop; Site 2: 32% drop; Site 3: 45% drop; Site 4: 18% drop; Site 5: 15% jump; Site 6: 89% jump. Strangely, the sites that dropped were my tech niche sites and the jumps were my sports sites.
 
One of my newer sites had some big drops; others were consistently red but with small drops, and one site improved.

It's too early to tell, they usually bounce back 75% or so within a week, but I would be interested to know why I seem to get hit by most updates the last 2 years. It's becoming a pattern of 2 steps forward, then 1 step back with each update.
 
My new website had a little dip yesterday but is already on track to do the same or better today (as it was doing earlier this week). As @JamaicanMoose said, for me, it's business as usual until around Christmas.
 
Seems as though newer sites (less than 2 years old) are the ones getting hit the worst, which makes sense because Google likes to fiddle with the ranking signals (especially with the pre-Christmas updates), in this case age/authority. Like a couple of people have said here, many times these updates end up correcting themselves, in some capacity.

Also, I have a couple of clients who are in sketchy niches and are constantly neg SEOed. All of them have gotten crushed with this update. I also have a few rep suppression clients who I hammer negatives with massive amounts of links and all of those got hammered too, so there is definitely a Penguin element to this update.
 
The one site of mine that was hit for real was a new site built on an expired domain. I would say good content, but who knows.
 
So far I'm skating by unscathed. Of my main projects, one is almost 7 years old and the other just over 1 year old (on an aged domain with existing links). Both are holding up their regular traffic patterns thus far.

I'm not seeing any meaningful change in the quality of traffic either (number of clicks on affiliate links, RPMs of ads shifting, etc.). Seems like another day in the neighborhood.
 
I've got a project I helped a friend with.

It was actually part of the series on SERPWoo about ranking a new site from scratch, where I was able to hit #1 for its big B2B term in just a few months of starting. With no links or content...

That project was hit by this recent update.

So the only thing I can think of off the bat is:
  • No to low content
  • No age
  • No backlinks
Which is a lot, of course.

Most newer sites don't have a lot of content or backlinks.

What I'd like to do is dive into which one (it could be all of them, though) is really hammering sites this time.

So can we find sites that are older with low/no content and no backlinks ranking?

Can we find new sites with lots of content and no backlinks ranking?

Can we find old sites with lots of backlinks and no content ranking?

It will be a dig into the data for sure over the next few weeks. I'm thinking you're going to need all of them to be ranking right now.
 
I wish someone could shed some light on this because I can't figure it out. My site got hit with a 60% drop in traffic and I'm pretty unmotivated. Went from $20+ a day to $2-$4 a day now. Really upset. Being told to make more content doesn't really solve this in my mind. I guess the only thing to do is wait and see what happens between now and the next update? Anyone have insight on this? It hurts the new guys like me who are pumping out 4+ articles a day.
 
I mean, if we assume Google's motivation is evil (increasing ad spend), then it definitely makes sense to hit the upstarts, as the established players have likely already maxed out their ad spend.

Have to work really hard on establishing a strong backlink profile and trust.

Like I said, 2 steps forward, 1 step back (updates), is the way to go it seems.
 
Thanks @bernard, this helps motivate me. A 60% drop hurts as a new guy in the space, but I suppose this is the key. I have no backlinks or trust built up.
 
When people respond about their improvements or drops in rankings and traffic after these updates, a big question to ask is about what they're NOT saying...

For instance... how many were using cheap PBNs that got popped? How many were ranking on the merit of low-quality backlinks in general (anything that's not contextual)? How many were ranking because their anchor text usage ratios were jammed through the roof?

All of these things are what I refer to generally as "link data" that gets crunched between updates and rolled out live. They can say all day that Penguin and Panda (and there's some phrase they use for it that I can't think of) are now current and live in the algorithm, but I'm 100% sure they're active only in the sense of being "rolling" updates.

And this also leads me to sites like @AhFreshMeat's and others on the forum where I mention this a lot: New sites that come out of the gates swinging will always perform better than they should. Then they experience a correction once Google does the first set of calculations on their link data. It's not a "penalty" but a return to (or first experience of) where they SHOULD be ranking.

That's why this makes sense too, and I've used the exact same phrase regarding updates:

Like I said, 2 steps forward, 1 step back (updates), is the way to go it seems.
It's 2 steps forward because there's a window of time between updates where URLs can perform better than they should before the correction occurs. This goes for new sites and new content on old sites too.

Google is better off letting a URL out-perform itself than holding them back even more than they already do. There's already enough confusion built in with rank transition filters and delayed, throttled results, all meant to confuse spammers.

But if you aren't giving off spammer signals, you can do better than you should. Spammy sites can't do this after they pass their first "window of opportunity" between updates. But they can before the first one, which is why you can still churn and burn, whether or not that's worth your time.

So for someone like @AhFreshMeat and anyone with relatively young sites (whether in age or in earnings), I'd argue that you don't have enough page rank flowing. This is typically the answer for a lot of problems that are talked about on the forum like "why are my pages bouncing?" and "why does my site get crawled so slowly?" and "why does Google index such a low percentage of my content?" and so forth.

I agree with @bernard in this regard. The answer is a higher quantity and a higher quality of links with natural anchor text profiles.

This is something that applies to every update since Penguin rolled out and to most problems people experience.

You can take these same "out-perform itself" and "over-optimized" aspects of link profiles on new sites and "in between updates windows" and apply it to on-page. I haven't said this on the forum but I've even seen keyword stuffing back into play between certain updates before the data gets crunched and they get whacked.

Matrix SEO Analogy
It's the same old game I've talked about since before BuSo existed. The Matrix movie is the perfect analogy for all of this. Neo is the anomaly in the Matrix. He's the weakness in every algorithm, the one exploitable variable that's weighted too highly. But just like in the Matrix, the Architect KNOWS that The One weakness will always exist, so the real move is to control him and get him to move and act as the Architect pleases. Then the next update comes, The One dies and is resurrected, and a new vulnerability in the algorithm now exists. By the time you target it and exploit it, it's time for it to die and for you to be outed as a spammer, and for the weakness (the window of opportunity) to be moved to another spot on the wall. Anyone caught aiming at that window will find themselves splatting against the wall once that window moves again, and your splat on the wall is evidence you are a spammer.

The way out of this is to stop being targeted by the machine as a potential spammer. And the way to do that is to build trust and authority. The way to do that is:
  1. Let your site exist for over a year
  2. Don't give off spam signals on-page or off-page
  3. Build brand signals
  4. Get a high quantity and high quality of contextual backlinks with natural anchors
Do that and rinse and repeat on the same site update after update and you'll see the fluctuations in your properties during each update get minimized because your signals aren't being discarded during the offline data calculations they roll in. And you get leeway for being trusted and authoritative.
 
Someone mentioned age/DA above. I think that is true in this case. My older sites are fine, but a new site on a brand new domain (about 1 year old) that has been doing REALLY well (I make killer text guides with videos) has tanked a little, say down 20% to 30%. A little disappointing, TBH, since I had started to become optimistic about Google telling us that the best content always wins, and I was competing against much bigger, older, established sites. The site is still doing well, mind you, above expectations, just not as well.

My other, older sites seem to have had minor traffic increases. All in all I am not seeing any massive changes.
 
I just don't see how my sites would fall into any of the spam criteria. I don't use PBNs at all, but many others do and they don't get hit.

I usually have far fewer backlinks than my competitors, but I guess my on-page is on point and my UI signals are good, and I get pulled back by not having enough links.
 
This is a great post and will help many of us searching for answers.

I don't have any backlinks at all and my site is 9-10 months old. Perhaps that's the issue.
 
Just to provide you guys with more data: my 7-month-old site has held its pre-Black Friday traffic. Interestingly, the corrections impacted only the newer articles that recently got higher rankings for some keywords. But the articles occupying those higher positions now are trash/irrelevant/Reddit, so I am confident that it'll be reversed.

For anyone wondering, the site does have some backlinks but not too many.
 
17 organic clicks on December 3rd, 3 organic clicks on December 4th. 510 organic impressions on December 3rd, 232 organic impressions on December 4th. Awesome. Glad I wrote blog posts for 6 hours a day for over 2 months just to have my 100% whitehat website get destroyed for no reason.
 
Hey buddy, it's okay. Going from 17 clicks to 3 doesn't mean your site is destroyed, that's still well within reason when you're dealing with small numbers and it's far too soon to see or worry about any trend. It hasn't grown enough yet to be destroyed, you're good. If this were my site, I wouldn't think twice about going from 17 to 3 to 30 to 64 to 11. This is far too small of a sample size to glean anything from. You'll see a bigger % of variance when dealing with smaller numbers.

Deep breath. Keep your eyes on the prize, keep doing the work, you'll be good. You didn't start this site to give a fuck about a couple of visitors, right? If you went from 3k to 3, then it's time to sound the alarm, but this is just part of the journey and you're still on your first steps. You're here for thousands and thousands a day. As far as things to worry about, I would be more concerned about spending 6 hours a day for 2 months writing about stuff that nobody is searching for. But in any case, you've built yourself a foundation here, you've probably learned a lot just by doing, now it's time to heed the advice many have given you, @Ryuzaki in particular because this is his bread and butter and he does the stuff he writes about, and start going after keywords that are being searched for. You'll be alright, everyone's here to help, eyes on the prize my duderino.
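To make the small-numbers point above concrete, here's a back-of-the-envelope sketch (my own illustration, not from the thread): if daily clicks behave roughly like a Poisson process, the typical day-to-day swing relative to the mean is about 1/sqrt(mean), so tiny sites see enormous percentage variance while big sites barely wiggle.

```python
import math

# Rough Poisson-noise heuristic: typical relative daily swing ~ 1/sqrt(mean clicks).
# This is only a sketch to show why 17 -> 3 clicks is noise, not a trend.
for mean_clicks in (17, 300, 3000):
    rel_noise = 1 / math.sqrt(mean_clicks)
    print(f"{mean_clicks:>5} clicks/day -> ~{rel_noise:.0%} typical daily swing")
```

At 17 clicks a day the model predicts roughly 24% typical swings (and multi-sigma days happen constantly), while at 3,000 clicks it predicts only about 2%, which is why a 3k-to-3 drop would be the real alarm.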
 
This advice needs to be sticky for updates, an amazing post by Ryu.

I am happy to report my traffic has more or less recovered. It went like this: update day, 50% drop; next day, 30% recovery; following day, 20% recovery. I feel this shuffle-up penalized the sites that weren't high enough authority, as Ryu mentioned, but then user signals like CTR and time on page slowly boosted me back up over the next few days.
 
Honestly, I'll buy your site if it's decent and you want to sell... I'd do this because I know for a fact your traffic will come back. I've already regained positions, and I lost 65% of my traffic on the 3rd of Dec.

Let me know if you're interested in selling. I'll scoop it right up. What is the age of the site?

I lost 65% of my traffic on December 3rd and 4th. By the 5th and 6th it had regained to about a 50% drop in traffic.

-----

I'm convinced this is slowly creeping back and reversing... at least to a traffic level where I'm not upset. While still very unmotivated, I've learned from the posts of @bernard and @Ryuzaki and a few others that it will all be fine and dandy if you do your due diligence and follow the simple guidelines of a strong site: proper internal linking, content clusters, no PBNs, age of site, backlinks, anchor texts, topical relevancy, strong silos, etc.

I went through my rank tracker and noticed garbage sites outranking me; it didn't make sense. Many were 2-3 year old posts. I think this core update does favor aged sites, especially in the tech-related niches. It appears these two factors are commonalities amongst builders not only on this forum, but on many others where I've seen people complain.
 
My performance after the update:
  • Domain: 5 years old
  • Website live: 3 years
  • Organic traffic Friday: -48% week over week
  • Organic traffic Saturday: -34% week over week
  • Organic traffic Sunday: -37% week over week
I lost ~70% of my traffic (organic) in May. Afterwards, it didn't move up or down until October. In Q4, I moved back to halfway April traffic levels. With this update and provided traffic continues the current pattern, I will have gained 16% versus June levels.

Put differently, in Apr I was at 15k, Jun until Sept 5k, Nov 7k and Dec looks like I will end with around 6k organic.
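For anyone sanity-checking figures like these, the drops and gains quoted throughout this thread are plain percent changes. A minimal sketch using the rounded monthly numbers above (the poster's 16% figure presumably comes from unrounded data):

```python
def pct_change(old, new):
    """Percent change from old to new, e.g. 100 -> 150 is +50.0."""
    return (new - old) / old * 100

# Rounded monthly organic traffic figures from the post above
april, june, december = 15_000, 5_000, 6_000

print(f"summer vs April: {pct_change(april, june):+.0f}%")     # -67%
print(f"Dec vs June:     {pct_change(june, december):+.0f}%")  # +20%
```

With these rounded inputs, December works out to about +20% versus the June-September plateau, in the same ballpark as the 16% the poster cites.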

Could I be hit by EAT? Maybe. Could it be lack of links/authority? Yes. I didn't do enough link building this year because I focused on content creation, but... I literally started focusing a ton on link building in Nov. And I don't mean "two hours more a week" a ton. I mean I hired an agency to increase quantity of links a lot (while taking quality into account). So, this update feels extra sour. You know, "too little too late" and all that. I hope their work gets me to where I need to go next year.
 