Google Algorithm Updates - 2019 Ongoing Discussion

Sutra

Breathe the body deep
BuSo Pro
Joined
Oct 28, 2015
Messages
657
Likes
614
Degree
3
I don't see how this should affect me though. My site is only in my native language, so the audience is native-only. The only difference is that the TLD is not the native one.
You may have been getting traffic from different regions e.g. You're in the US but were getting traffic from people in England, Australia, etc. but you have lost that traffic. Now, Google is sending that traffic to sites in England and Australia instead of sending it to yours.
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
You may have been getting traffic from different regions e.g. You're in the US but were getting traffic from people in England, Australia, etc. but you have lost that traffic. Now, Google is sending that traffic to sites in England and Australia instead of sending it to yours.
No, I only get traffic from one country.
 
Joined
Jul 5, 2019
Messages
62
Likes
65
Degree
0
No, I only get traffic from one country.
None from Swedish people living in Denmark or Norway (or other expat countries) then?

(Edit: or wherever it is that you live...)

Regarding strangeness in countries/languages, I just had Google offering me the Lower Saxon dialect version of Wikipedia https://nds.wikipedia.org/ as part of a search, despite the fact that I am nowhere near Lower Saxony and have no idea what their dialect is like.
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
None from Swedish people living in Denmark or Norway (or other expat countries) then?

(Edit: or wherever it is that you live...)
Yes, but only a few percent, so that's definitely not the cause.
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
I'm seeing some beginning signs of reversal/rebound.

While again a new page dropped today.

So it's still rolling out, but at the same time also correcting?
 
Joined
Apr 12, 2019
Messages
33
Likes
50
Degree
0
I'm seeing some beginning signs of reversal/rebound.

While again a new page dropped today.

So it's still rolling out, but at the same time also correcting?
That seems to be the trend with updates the past couple of years. Pages get an initial boost / drop and then regress / recover somewhat. Then the SEO community is left wondering wtf happened.
 

andreint

BuSo Pro
Joined
Oct 20, 2014
Messages
152
Likes
189
Degree
1
I'm seeing some beginning signs of reversal/rebound.

While again a new page dropped today.

So it's still rolling out, but at the same time also correcting?
Same here. One site dropped 40% on the 8th, just to get +130% (vs. weekly avg over the last 60 days) yesterday AND today.

Weird shit
 
Joined
Sep 3, 2015
Messages
376
Likes
173
Degree
1
I believe snippets got adjusted over the weekend. At least that is what happened to me.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,766
Likes
7,156
Degree
8
I believe snippets got adjusted over the weekend. At least that is what happened to me.
There was also some kind of roll out of a content quality filter. It seems like posts are getting indexed and lingering around behind where they should be for a while, then every three months or so some offline data that was crunched gets added into the algorithm, and bam, posts land where they should. Then it's wait and repeat. (It's Panda).

Between this filter and what you're pointing out with snippets, I saw a decent traffic jump plus steady gains daily now. It's times like these that I believe Google is really starting to understand the quality of content beyond external signals like links. Either that or my on-page is dat crack. I think it's both. I always go above and beyond on content with the belief that it'll pay off more and more over the years as Google becomes sentient.
 
Joined
Sep 3, 2015
Messages
376
Likes
173
Degree
1
It's times like these that I believe Google is really starting to understand the quality of content beyond external signals like links.
Looking at my niche over the last 24 months I would have to agree. It's good to know because like you I also focus on UX and good content (as much as I can being a display ads publisher!)
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
There was also some kind of roll out of a content quality filter. It seems like posts are getting indexed and lingering around behind where they should be for a while, then every three months or so some offline data that was crunched gets added into the algorithm, and bam, posts land where they should. Then it's wait and repeat. (It's Panda).

Between this filter and what you're pointing out with snippets, I saw a decent traffic jump plus steady gains daily now. It's times like these that I believe Google is really starting to understand the quality of content beyond external signals like links. Either that or my on-page is dat crack. I think it's both. I always go above and beyond on content with the belief that it'll pay off more and more over the years as Google becomes sentient.
This is so hard to quantify though.

I would of course say that my content is better than the competition's, but how do I prove it? Do I include more subtopics? Does my writing read more easily and flow better? Time spent on page?

I find it difficult to accept that I've been demoted site-wide due to content quality; it just doesn't make sense.

Unless it's stuff such as indexing archives and categories. Should I noindex them again?
 
Joined
Sep 3, 2015
Messages
376
Likes
173
Degree
1
I would of course say that my content is better than the competition's, but how do I prove it? Do I include more subtopics? Does my writing read more easily and flow better? Time spent on page?
  • How easy it is to get the info they're searching for
  • Layout/design
  • Ability to navigate the page
  • Presentation
  • Grammar/spelling, etc.
  • Diagrams/videos/etc.
  • Helpful links to other related guides/content

It might sound obvious but put yourself in your average visitor's shoes and visit your competitors vs yours.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,766
Likes
7,156
Degree
8
I find it difficult to accept that I've been demoted site-wide due to content quality; it just doesn't make sense.

Unless it's stuff such as indexing archives and categories. Should I noindex them again?
I didn't mean to imply anything about your case. Have you confirmed that you were demoted, or could you have simply been bumped down by the promotion of other sites? It is a zero-sum game, after all.

My approach is to index the first page of the categories, show upwards of 25 to 50 posts per page, add a little static content to them, and noindex all subsequent sub-pages like /category/page/2/ and forward. Google is adamant that you don't need to do this, but I'd rather clean up my indexation as much as possible to boost my "sitewide content quality score," which I firmly believe exists and is calculated by Penguin. I got confirmation that it exists and posted it somewhere on the forum when I found it.
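The pagination rule described above can be sketched as a simple URL check. This is a hypothetical illustration, not code from any specific CMS or plugin; the /category/.../page/N/ path pattern is an assumption based on common WordPress permalinks.

```python
# Hypothetical sketch: index the first page of a category archive,
# noindex paginated sub-pages like /category/widgets/page/2/.
import re

PAGED_CATEGORY = re.compile(r"^/category/[^/]+/page/\d+/?$")

def robots_meta(path: str) -> str:
    """Return the robots meta value for a given URL path."""
    if PAGED_CATEGORY.match(path):
        return "noindex, follow"  # keep crawling links, drop page from index
    return "index, follow"

print(robots_meta("/category/widgets/"))         # first page stays indexed
print(robots_meta("/category/widgets/page/2/"))  # sub-pages get noindexed
```

Keeping "follow" on the noindexed pages means link equity can still flow through the archive even though the page itself stays out of the index.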

I think SEO used to be: off-page and on-page. Now I think it's equally spread among:
  • Off-page SEO
  • On-page SEO
  • Technical SEO
I'd audit your site with a fine tooth comb. Turn over every rock you can think of, from as big as indexation bloat to as small as "an image is missing an alt tag, my form is missing a label" etc. Typos, consistent usage of your brand name, broken links, redirect chains, everything and anything.
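One of the small checks from that audit list (images missing alt tags) can be automated with nothing but the standard library. A minimal sketch, assuming you have the raw HTML of a page; a real audit would use a crawler like Screaming Frog across the whole site.

```python
# Stdlib sketch of one "fine tooth comb" check: flag <img> tags
# with a missing or empty alt attribute.
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            d = dict(attrs)
            if not d.get("alt"):  # absent or empty alt
                self.missing_alt.append(d.get("src", "?"))

html = '<img src="a.jpg" alt="A widget"><img src="b.jpg"><img src="c.jpg" alt="">'
auditor = AltAuditor()
auditor.feed(html)
print(auditor.missing_alt)  # → ['b.jpg', 'c.jpg']
```

The same parser pattern extends to the other checks mentioned (form inputs without labels, broken internal links, and so on).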

Fix and improve everything, no matter how big or small. It's what it took for me to experience this (and you all are probably as sick of me gloating about it as you were of me crying about it):


I believe 75% of that plateau was an index bloat problem that went unnoticed for 2 years. When I really started tanking in August 2018, I panicked and did the "fine tooth comb", "whole kitchen and the sink", "nuclear method" audit: 75% index bloat / content pruning, 10% user experience improvements, 5% god-tier page speed, 5% tiny things like image alt tags.

Trying to isolate and find the one or two variables is going to be impossible. Go nuclear, scorched earth, and fix every possible thing, even if it seems negligible, invisible, or a waste of time. Do all that at the same time, as fast as you can, and wait for the refresh. That's literally all we can do at this point. That's what unlocked my site from a plateau that nearly ended the site, as far as my motivation for it went.
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
Yes, I'm going to make some changes. I have rebounded like 10% or more it seems. I was down pretty much exactly 30%, so maybe half that coming back now. Only down 15-20% or so.

My approach is to index the first page of the categories, show upwards of 25 to 50 posts per page, add a little static content to them, and noindex all subsequent sub-pages like /category/page/2/ and forward.
What I don't understand is that I use the canonical tag on all these subpages but they still index.

Other things I struggle with is the intent factor. Should I split "buying guide" vs "best product" into separate sections?

I have long content usually, which is the norm in this niche, but the update seems to have lessened that. The strange thing is that some people are ranking who have never done so in the 2 years in this niche. Not new sites either. Older sites that just never really mattered.

I have some ideas. Probably going to think hard about product descriptions. I admit those are weak.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,766
Likes
7,156
Degree
8
What I don't understand is that I use the canonical tag on all these subpages but they still index.
That's because canonical is only a suggestion, and pointing all the sub-pages to the first page is an incorrect usage. I'm not surprised Google would ignore that. Canonical is basically for syndication of the exact main content across different domains, or stuff like faceted search results for eCommerce, like ?color=green&size=XL.
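The faceted-search case can be sketched with the standard library: the canonical URL drops the facet parameters, so all the color/size variants point at one page. Which parameters count as facets is an assumption here for illustration.

```python
# Hypothetical sketch: build a canonical URL by stripping facet query
# params, so /shirts?color=green&size=XL canonicalizes to /shirts.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

FACET_PARAMS = {"color", "size", "sort"}  # made-up facet keys

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://shop.example/shirts?color=green&size=XL"))
# → https://shop.example/shirts
```

Non-facet parameters survive, so a URL like ?page=2&color=green keeps its pagination while losing the facet.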

Other things I struggle with is the intent factor. Should I split "buying guide" vs "best product" into separate sections?
Probably. I have a ton of old "best" posts that have buying guides that I'm not splitting up. I feel like I should, but the SERPs are still filled with ones with buyer's guides in them. So I'm following the lead of the live SERPs for now.
 
Joined
Jul 5, 2019
Messages
62
Likes
65
Degree
0
Here's a quick rundown on geographical/language shifts on a long-term site I've been looking at.

Year-on-year audience from September 1 to now: +24%

Main customer base (and language and site language): +7%
Main visitor base (and language and site language): +29%
Various other countries sharing site language as their main language: +9%, +10%, +3%
Location of server (does not share language): +61%
Location of topic (does not share language): -6%
Other countries where inhabitants may have substantial numbers who may use main site language as first (expat) or second (not a lot of people speak theirs) language: +60%, +43%, +40%, +53%, +85%, etc.

Only real difference is that site has had main language social media presence this year as opposed to last year (resulting in 4% direct traffic) and a few extra posts on its blog...

So, as I mentioned earlier, to me it looks like there is some funky stuff going on with geo/language targeting...
 

CCarter

If they cease to believe in u, do u even exist?
Staff member
BuSo Pro
Boot Camp
Digital Strategist
Joined
Sep 15, 2014
Messages
2,445
Likes
5,716
Degree
6
This is so hard to quantify though.
I'll say this: a lot of this is almost subjective. It's like an artist attempting to judge their own work. You need a second set of eyes on your project, since you might be missing some elements that are obvious to a person who has no bias toward your project.

For example, in some of the recent consulting work I did, there were a ton of pagespeed problems and things like missing sidebars: small things that all add up, but everyone thought their website was fine until a 3rd party pointed it out.

There are problems that I know I have on my own projects and never really "consider" until someone else points them out, and then I have the "oh yeah" moment.
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
Do you think this update could have something to do with Above the Fold content or quick user satisfaction?

I'm just thinking about all the speed stuff, with Google specifically showing First Paint in their reports. Then I look at my sites and it's the long posts that seem to suffer, and Google specifically tells you not to exceed ~1,500 DOM elements in their speed test.

In addition, some of those long pages did not have a typical comparison table at the top.

Then I looked at some of those sites who got a boost ahead of me and they, for the most part, seemed to have comparison tables up top. Some didn't, but that doesn't mean anything of course, because of the weighted factor stuff.

I'm leaning in this direction now. Going to try to think how I can do this better.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,766
Likes
7,156
Degree
8
@bernard, I think the way you're thinking is good. You're identifying things that other sites may be doing better than yours. And even if that doesn't help directly in the algorithms, it will improve other metrics that may play into the algorithm. I'd seek out as much of this kind of thing as possible and implement them.

What I think you shouldn't be doing is trying to find some kind of single variable that Google changed that may explain what happened. We're waaaay past that. Google is 21 years old now. The variables are in there now and none work or exist in isolation. That goes for the big broad variables as well as the tiny ones that might lead to intent classifications, etc.

I'd add that comparison table. It's good for users and will boost your clicks big time. Make the links open in new tabs so they can keep reading and try to understand the products more deeply (and also keep your on-site metrics going).

Sure, speed is important. I really doubt Google just flicked the switch on anything related to speed. It's been in play for years. But improving that will undoubtedly be good for users and the algorithm.

Like I was saying above, I'd recommend going nuclear and improving everything big, medium, small, and astronomically tiny. Fix everything and make everything better. That's the only way to undo the damage of an algo change. You'll never actually find the variables they changed, but by improving every variable you can assure yourself and the algorithm that you fixed up the problem.

The beauty of this is you only really need to do it once. If you do it to full completion, then later you can just run simple audits and spidering to cover the basics to make sure nothing crazy is slipping through the cracks (like some weird indexation).

And once this is done, you'll have eliminated a ton of possible things that would be holding you back. So then, if you get a negative impact from the algo then you know it's nothing obvious like bad user experience and nothing catastrophic like indexation problems.

You can know that it's just the natural re-weighting of the variables that everyone gets caught up in. And from there the game will be simple. Just keep creating content if applicable, otherwise keep getting links.

Getting the technical SEO part of the equation out of the way is entirely possible and will simplify everything else. I think this is a core thing that's hurting sites these days, as Google tries to weight technical variables so they have more to measure to improve SERP quality. And since nearly everyone is using pre-built themes and crazy amounts of plugins, there's bound to be plenty that can screw a site up.

I think it works like... Technical SEO = better for users in terms of bandwidth usage and CPU usage, and better for Google's spiders in regards to crawling. They'll save a ton of money if they can ultimately convince everyone to get their tech SEO together. A great place to start is spidering your own site and looking at what's indexed in Search Console and with a site:domain.com search and make sure those numbers are reality based. And then look at internal links where spiders go that you wouldn't, which indicates problems. Like faceted search, weird comment links, etc.
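The "make sure those numbers are reality based" check above boils down to set comparison: the URLs you intend to have indexed (say, from your sitemap) versus what a crawl or site: search actually turns up. A minimal sketch with made-up URLs:

```python
# Sketch: diff intended URLs (sitemap) against discovered URLs (crawl /
# site: search) to surface index bloat and missing pages. URLs are examples.
sitemap_urls = {
    "https://example.com/",
    "https://example.com/best-widgets/",
    "https://example.com/widget-reviews/",
}
crawled_urls = {
    "https://example.com/",
    "https://example.com/best-widgets/",
    "https://example.com/?s=widgets",           # internal search result bloat
    "https://example.com/tag/widgets/page/3/",  # paginated tag archive
}

bloat = sorted(crawled_urls - sitemap_urls)    # discovered but unintended
missing = sorted(sitemap_urls - crawled_urls)  # intended but not found

print("Possible index bloat:", bloat)
print("Missing from crawl:", missing)
```

Anything in the bloat list is a candidate for noindexing, blocking, or fixing the internal link that exposed it; anything in the missing list points at crawlability or internal linking problems.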
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
@Ryuzaki Thanks for the suggestions, I will go dig up your posts on the topic.

I've seen quite a big upwards surge today though. Not all keywords but many. Back to around 80-90% of original rankings.

I will still do the full site check, but this could suggest speed being a parameter for me, since I fixed that glaring PHP issue.
 

Stones

BuSo Pro
Joined
Nov 3, 2018
Messages
43
Likes
34
Degree
0
One of the sites I'm managing has responded to work done over the last couple of weeks. We went nuclear on it and did everything from speed optimisations, to interlinking, to adding content, and also diluting anchor texts. Pleased to see positive movements over the last 48 hours.
I do wonder whether it was the work done or just Google tweaking another dial.
 
Joined
Apr 12, 2019
Messages
33
Likes
50
Degree
0
I also recommend the 'going nuclear' thing. When I've done this, it's typically taken 2-3 months and/or a couple Google algorithm updates for the changes to affect the SERPs.

Also, I really recommend Screaming Frog to assist with this. You're bound to break some shit. Screaming Frog will help you easily identify it.

Quoting @Ryuzaki here:

There was also some kind of roll out of a content quality filter. It seems like posts are getting indexed and lingering around behind where they should be for a while, then every three months or so some offline data that was crunched gets added into the algorithm, and bam, posts land where they should. Then it's wait and repeat. (It's Panda).
This definitely happens and it is infuriating. Especially when it's very, very low or non-existent competition and I post on a strong domain.

That's why patience is key in SEO
 
Joined
Dec 31, 2016
Messages
624
Likes
452
Degree
2
@Ryuzaki

I think I'm also looking at some of the same you did. All kinds of "crawler bloat".

Good thing is I don't have any errors outright, but this is my Google exclusion report:

111 indexed

Excluded:

1325 noindexed
527 blocked by robots.txt
356 crawl anomalies
119 404-pages
85 redirected pages
48 soft 404s
11 crawled not indexed yet
11 duplicate pages (weird, these are redirects)

That doesn't look good does it?

Most of it is caused by wrong settings on the Custom Posts (need to set Publicly Queryable to false, I think). Then I need to go ahead and nofollow all the Pretty Links in the posts. They're "nofollow, noindex" in the redirect, but I don't want Google to actually follow them at all. Got to look into those "crawl anomalies" as well.
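Finding the Pretty Links that still need a nofollow can be automated: scan post HTML for anchors pointing at the redirect prefix without rel="nofollow". This is a stdlib sketch; the /go/ prefix is a hypothetical cloaked-link path, not taken from the thread.

```python
# Sketch: flag redirect/cloaked links (hypothetical /go/ prefix) that are
# still missing rel="nofollow" in the post HTML.
from html.parser import HTMLParser

class RedirectLinkChecker(HTMLParser):
    def __init__(self, prefix="/go/"):
        super().__init__()
        self.prefix = prefix
        self.needs_nofollow = []  # hrefs that should be fixed

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href", "")
        if self.prefix in href and "nofollow" not in d.get("rel", ""):
            self.needs_nofollow.append(href)

html = ('<a href="/go/widget-deal">deal</a>'
        '<a href="/go/other" rel="nofollow">ok</a>'
        '<a href="/about/">about</a>')
checker = RedirectLinkChecker()
checker.feed(html)
print(checker.needs_nofollow)  # → ['/go/widget-deal']
```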
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,766
Likes
7,156
Degree
8
@bernard, those pretty links can result in blank pages being indexed, which is what happened to me. Definitely nofollow them to control crawling (though Google has said they're now going to crawl them to decide how to use them, they should learn they're affiliate links and stop crawling them). One thing you do NOT want to do is block crawling of those in the robots.txt, or they'll index the blank redirects as pages.

Everything else that's noindexed, blocked by robots.txt, 404, etc... you'll want to update the links pointing to them to be nofollow in hopes of stopping Google from crawling those links. You're probably not seeing crawl budget issues, but I think being efficient for Google's sake is a great idea, something they'll reward more and more in the future as crawling and indexing costs keep skyrocketing. They may already, as some small ranking factor.