Google Algorithm Updates - 2019 Ongoing Discussion

What you can do though is look at PageSpeed Insights. It gives you more than just the Lighthouse score, showing you metrics like First Contentful Paint, First Meaningful Paint, etc. Display ads that are asynchronous will have no effect on these, and these are definitely the ones the Google algorithm uses, as opposed to some aggregated metric that includes full download time, which isn't relevant to users at all. All that matters is when stuff appears on the page and when it's usable.
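If anyone wants to check those paint metrics programmatically, the PageSpeed Insights v5 API returns them inside the Lighthouse result. A minimal sketch below; the field names follow the documented `lighthouseResult` structure, but the sample response data is entirely made up:

```python
# Sketch: extracting the paint timings mentioned above from a
# PageSpeed Insights v5 API response (already fetched and parsed as JSON).

def paint_metrics(psi_response):
    """Return the user-visible timing audits (milliseconds) from a PSI result."""
    audits = psi_response["lighthouseResult"]["audits"]
    wanted = ("first-contentful-paint", "first-meaningful-paint", "interactive")
    return {key: audits[key]["numericValue"] for key in wanted if key in audits}

# Invented sample data, shaped like a real response:
sample = {
    "lighthouseResult": {
        "audits": {
            "first-contentful-paint": {"numericValue": 1800.0},
            "first-meaningful-paint": {"numericValue": 2100.0},
            "interactive": {"numericValue": 3900.0},
        }
    }
}

print(paint_metrics(sample))
```

Watching these three numbers over time tells you more than the single Lighthouse score, since they track what a visitor actually experiences.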

Thanks. TBH I have always scored awfully on PageSpeed Insights and always assumed it was because of ads (even First Contentful Paint, First Meaningful Paint, etc. are very bad, like 9 seconds).

Based on what you said I just ran a test on the same site/page with ads disabled and while my scores weren't as bad, they were still very bad (like 5 seconds plus).

I think maybe my theme has a lot of scripts (paid theme from Themeforest) and I use Google Fonts too.

I was pretty shocked as I thought my setup was ultra optimised and quick but for my display ads (dedi server, nginx cache, etc).

Think I need to work on this... Another thing that fooled me is pagespeed always saying 9 seconds, but when I load the site on my phone (with ads) it's literally a second and I can interact with the content (cached or not). So I disregarded what pagespeed was telling me.
 
I checked my stats this morning and noticed traffic was cut in half as of June 3. First place I thought to look for clues as to why was here, and sure enough this community was on top of it. Thanks BuSo - ever vigilant. I got work to do.

  • Content Pruning - deleted about 150 posts, which was about 40% of the total content
  • Indexation Cleanup - stopped indexing "paged" pages of archives and fixed like 800 robots.txt blocked pages that should have been noindexed.
  • Speed Optimization - got even faster, now scoring a 100 on the lighthouse test.
  • Design Stuff - Made my Desktop and Mobile menus way better. Cleaned up Sidebars.
  • Crawl Budget - Working on this more now that I have the robots.txt changed, need to add nofollow to a ton of stuff. In process.
  • 301'd Domains - I doubt this is related but I'm including it. I killed off three 301'd domains. One wasn't niche-relevant at all though, so maybe it did play a part.
There were things I was going to do that haven't left the staging server yet, like reducing the number of display ads, updating every post (I'm converting what's left after pruning to gutenberg blocks), etc. I'm still going to push all this out so we'll see how that plays out.

Using this as a blueprint
 
  • Indexation Cleanup - stopped indexing "paged" pages of archives and fixed like 800 robots.txt blocked pages that should have been noindexed.

When you noindex a page, does it still pass on link strength? When I noindex everything after page 1 of categories and there is a post which is on page 2 and has no other internal links, is it considered orphaned?
 
When you noindex a page, does it still pass on link strength? When I noindex everything after page 1 of categories and there is a post which is on page 2 and has no other internal links, is it considered orphaned?

Pages can be noindex'd but still follow'd. That means spiders still crawl them, still find those pages, and still flow PageRank. So no, that post on page 2 isn't orphaned. It's also the case that Google already knows about that post and will re-crawl it once in a blue moon, because everything in the index is rechecked occasionally. It should also exist in your sitemap, which they know about, and which will also trigger re-crawls.
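For reference, the noindex-but-follow state being described is just the standard robots meta tag (or the equivalent `X-Robots-Tag` HTTP header):

```html
<!-- Keep the page out of the index, but let spiders follow its links -->
<meta name="robots" content="noindex,follow">
```

`follow` is the default anyway, so `noindex` alone behaves the same; spelling it out just makes the intent explicit.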
 
So my Monday-to-Previous-Monday was a ~20% gain.

Today, the Tuesday-to-Previous-Tuesday was a 40% gain!

How are you guys faring? Any other boosts? Any catastrophes? Anyone seeing nothing?

If you saw a boost, do you think there was anything in particular you did to help usher in the boost? Were you impacted negatively from the "medic" updates before? I was, glad to see myself getting some gains back.

As far as what I did since last August when this all started going downhill for me:
  • Content Pruning - deleted about 150 posts, which was about 40% of the total content
  • Indexation Cleanup - stopped indexing "paged" pages of archives and fixed like 800 robots.txt blocked pages that should have been noindexed.
  • Speed Optimization - got even faster, now scoring a 100 on the lighthouse test.
  • Design Stuff - Made my Desktop and Mobile menus way better. Cleaned up Sidebars.
  • Crawl Budget - Working on this more now that I have the robots.txt changed, need to add nofollow to a ton of stuff. In process.
  • 301'd Domains - I doubt this is related but I'm including it. I killed off three 301'd domains. One wasn't niche-relevant at all though, so maybe it did play a part.
There were things I was going to do that haven't left the staging server yet, like reducing the number of display ads, updating every post (I'm converting what's left after pruning to gutenberg blocks), etc. I'm still going to push all this out so we'll see how that plays out.

I'm about 90% sure this includes a Panda data refresh.

@Ryuzaki I'm thinking about doing some cleanup like this. Have a few questions if you don't mind:

  1. I'm thinking about pruning some content, specifically the stuff that is getting zero search traffic. However, even though those posts don't get search traffic, they do get traffic from my newsletter subs and from my Facebook pages. Do you think no-indexing the posts that don't get search traffic will work just as well as deleting them?
  2. I'm a bit confused about what you meant when you said, "stopped indexing "paged" pages of archives" and also about "fixed like 800 robots.txt blocked pages that should have been noindexed." Can you clarify what you meant?
 
Do you think no-indexing the posts that don't get search traffic will work just as well as deleting them?

In terms of any Panda situation you may have going on, yes. It only looks at content in the index. But only delete or noindex low quality content. A page may not get traffic but still be high quality. It could be that you didn't do great on-page optimization or it was a time sensitive topic nobody cares about any more. In those cases a 2500 word post could be high quality but get no traffic. A 300 word post could be high quality too if it fills the needs of whatever the query's intent is. Like a definition, or a small recipe, etc. Just think clearly about what's high quality or not, and also take a look at the links of these pages.

"stopped indexing "paged" pages of archives"

Paged pages are those of archives that exist past the initial page. An archive is a category, a tag page, an author's profile page, a list of posts ordered by date, etc. It's your typical category style page with pagination.

So a paged page would be the 2nd, 3rd, 4th, ... 100th page of that pagination sequence. It's every page that's not the first page. I'm noindexing them because I don't want to bloat the index with them. Google has said it's fine to have these, just so you know; it's just a choice I made. They'll have the same title tag and meta description, and all the content on them is duplicated from other parts of the site (post titles, excerpts, same category description). Again, Mueller said it's fine. I was just being as aggressive as possible.

"fixed like 800 robots.txt blocked pages that should have been noindexed."

This was a problem on my end. I set up a system of .com/out/product-name style redirects for affiliate links so I could hot swap them across the site without having to change them everywhere I used them. I can just change the destination in a text file and it switches sitewide for me.

I set it up so each redirect threw up a noindex HTTP header so they wouldn't be indexed. Somewhere along the line I blocked the folder in robots.txt so Google wouldn't crawl them (I should have used nofollow but this was faster). This meant Google couldn't see the noindex, so they started indexing blank pages with no title tags or meta descriptions or any content.

My fix was to stop blocking the folder so Google could recrawl and see the noindex header. It worked perfectly, just took a while to get them out of the index. There's still like 10-20 to go.
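For anyone who wants to replicate a setup like that, here's a rough sketch of what the /out/ redirect system could look like in nginx (which the poster mentions running). Every path, product name, and destination here is invented for illustration; the map file is just one way to get the "edit one text file, swap sitewide" behavior:

```nginx
# Hypothetical /out/ affiliate redirect setup.

# In the http{} block (e.g. pulled in via an include file you can hot-edit):
map $uri $affiliate_target {
    default            "";
    /out/blue-widget   https://merchant.example.com/blue-widget?aff=123;
    /out/red-widget    https://merchant.example.com/red-widget?aff=123;
}

# In the server{} block:
location /out/ {
    # Send noindex as an HTTP header so crawlers can see it.
    # Do NOT also block /out/ in robots.txt, or they never will --
    # that's exactly the mistake described above.
    add_header X-Robots-Tag "noindex" always;
    if ($affiliate_target != "") {
        return 302 $affiliate_target;
    }
    return 404;
}
```

The `always` flag matters: without it, nginx only attaches the header to 2xx/3xx success responses from content handlers, and you want the header on every response from that folder.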

_____

June 2019 Broad Core Update Effects:

My +20% then +40% turned back into a +20% by the next day. Over the weekend it went down to around a +3% boost. We'll see how it plays out this week but it looks like that initial boost might have just been the chaos of the initial rollout. I hope to see some gains stick. If I see no overall negative effect, I consider that a victory too.
 
Our largest site took a 60% hit in traffic as of June 4th. Had just switched over to Ezoic for display ads at the same time and am curious if too many above the fold ads got us the hit or we indeed got hit in the update.

If we were hit in the update I'd expect it to bounce back "soon" but time will tell..
 
https://www.ccn.com/ccn-is-shutting-down-after-googles-june-2019-core-update
While CCN is unbiased as a news site – meaning no political agenda from the editorial board – we have allowed all sorts of opinions from journalists and guest writers. From the far left to the far right. Many of our political Op-Eds have been “Pro-Trump.” Remember: We are mainly a crypto centered news site. Cryptocurrencies started with Bitcoin in 2009 as a counteraction against the financial meltdown and the cartel-like power possessed by banks and The FED (which is privately owned). Bitcoin is, or at least was supposed to be, DECENTRALIZED. We are true to the crypto community with our slogan: We Are Anti-Elite, We Are Anti-Centralization.
While I won’t speculate whether or not this might have affected our site, I would certainly hope Google isn’t actively suppressing journalism.
 
Saw the CCN post and it seems ultra extreme! Surely they could put the site on life support and wait/try for a recovery.
 
Saw the CCN post and it seems ultra extreme! Surely they could put the site on life support and wait/try for a recovery.

Yeah, it seems like linkbait. I doubt they shut down. Even if you were done with it, I'd leave it live and take the free money. Then again, that company bought Examiner.com and took it down and never used it again. Who knows.
 
Do you think there could be any truth to this being an "anti-alternative media" kind of thing?
 
Do you think there could be any truth to this being an "anti-alternative media" kind of thing?

I do. They're absolutely in cahoots with traditional media to choke out small media. They use it as an excuse to clean up their results. "If we up-rank these 'trusted' sources, that will down-rank non-trusted, hateful, conspiracy sources" (aka big companies they aren't colluding with). It's not just a big tech and mainstream media issue, it goes past that to narrative control and controlling the flow of information on a national and international level.

I actually went into my sites' Google accounts and deleted all the history from search and from YouTube's watch / like list in case I had enjoyed any conservative media. There's zero question they're biased. The real question is will they go far enough to write algorithms to read all that data and down-rank anyone for wrong-think? Considering they're firing their own workers if they find out they aren't radicals on the other side of the pendulum, then yeah I think it's slightly realistic, realistic enough to take precaution.

In the case of CCN, I think they're just drumming up apocalyptic rumors for link bait. It's quite a stretch to consider a cryptocoin site a conservative journalism endeavor.
 
I actually went into my sites' Google accounts and deleted all the history from search and from YouTube's watch / like list in case I had enjoyed any conservative media. There's zero question they're biased. The real question is will they go far enough to write algorithms to read all that data and down-rank anyone for wrong-think? Considering they're firing their own workers if they find out they aren't radicals on the other side of the pendulum, then yeah I think it's slightly realistic, realistic enough to take precaution.

I don't talk politics here, but I do watch quite a lot of "alternative media" and now I'm worried about that. Are we really at a point where it would be prudent to strictly separate private and business surfing? I've begun using Tor Browser and Brave now, but my search history is still there. Time to delete it, huh?

I find downranking the Daily Mail while upranking Mirror.co.uk to be suspect. I don't believe they could do a "conservative media" algo, but they definitely could do a "not trusted news source" algo change, and IF they wanted, they could clean out some sites like the Daily Mail manually and blame it on the algo. I don't know how far down that rabbit hole it goes though.
 
FWIW, neither of my sites got hit. Everything looks stable.

A goal of mine is to completely rid myself of programmatic advertising. I'm curious if getting rid of AdSense or one of their partners would negatively affect my SEO. At the end of the day, they want to promote their product, which is AdSense. Why wouldn't they put an AdSense site above a non-AdSense site, all other things being equal?

I still have the naive thinking that they are better off being legit with their rankings. After all, people want the best search results. DuckDuckGo is good, but I still find myself going back to Google on occasion when I get stuck. They just have better results in some cases.
 
For what it's worth I thought I was maybe 5-10% down as a result of this roll-out. My traffic today/yesterday is pre-update levels though. Go figure.
 
I had a record number of organic searches and clicks on my main site, despite small rankings drops.
 
CCN Markets is Rising From the Dead:
https://www.ccn.com/ccn-markets-is-rising-from-the-dead/

They're claiming that somehow their old domain CryptoCoinsNews.com, which redirects to CCN.com, is getting all the visibility that they're losing from CCN.com. I wanted to type "Publicity Stunt Confirmed" but if you do a site: operator search you can see that both are indexed.

They've done something wrong that I'm not interested in digging into, but it's a classic case of having multiple versions of a site/content indexed, without proper redirects. I see canonicals set up, but otherwise I don't know. They've goofed somewhere.

There is the issue of their visibility not actually growing on the other domain:

[attached: search visibility chart for both domains]

Looks like they ran into Panda, which helps confirm my suspicions I posted earlier in the thread:

I'm about 90% sure this includes a Panda data refresh.

Anyone who took a hit needs to look into their indexation, content pruning, etc.

EDIT:
Someone (Dan Shure) on Twitter pointed out that CCN was getting what I'd guess is 60% of their traffic from one page /bitcoin-price. Another 30% was from a 2nd page. The last 10% is probably across the rest of their articles.

That main 60% page dropped from position 3 to position 7 for the main term "bitcoin price," for which their backlink profile is highly over-optimized (but probably still natural to some degree).

Coinbase took #1 and their anchor text profile for their page is much more spread out and natural. They also get 190,000 brand searches a month while CCN gets 27,100.

I see CCN at #5 AND #6 so they're still getting a fair chunk of traffic. I'm sure that this page slipping played a role but they definitely have much bigger problems to solve or they're really going to hate Google on the next update. Panda slowly decays your traffic over updates until you're left with a tiny trickle.
 
@Ryuzaki I took a big hit Aug 2018 and slowly gained about 25% of the loss back. But with this June update I lost that 25% gain.

Do you think it could have anything to do with internal linking? My internal linking is pretty horrible on a lot of pages (maybe 500 out of 1000 pages have bad internal linking), mostly leftover from the times when I was trying to drive pageviews for ad revenue. Tons of things are linked just because the word is the same, rather than being relevant.
 
@Sutra, I honestly couldn't say. I haven't done that type of interlinking in many years. If I had to guess, I'd think it's not hurting you significantly if at all, but you may be leaving some boosts on the table. It wouldn't hurt to map out your posts and readdress the interlinking eventually. I definitely don't think it would account for a 25% drop.

I would absolutely run a site:domain.com search and note the number of pages indexed on the first SERP page, then click all the way to the last SERP page and note that number there (they'll change). Then I'd go to Search Console, compare the indexation in the Coverage report to these numbers, and compare all that to the number of pages you're submitting in your sitemaps. Take care to account for "pages indexed but not submitted in sitemap," which will be stuff like pagination, but if there are problems you'll find them there and in other places. Dig around in the Coverage report.

I really do think Index bloat is being addressed since around August 2018 and I think a lot of people have indexation problems. Trash in the index is going to drag your quality score down.
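The sitemap-vs-index comparison described above is easy to script. A minimal standard-library sketch, assuming you have the sitemap XML and a flat list of indexed URLs (e.g. exported from the Coverage report) on hand; all URLs below are made up:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Collect every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)}

def indexation_diff(sitemap_xml, indexed_urls):
    """Compare submitted (sitemap) URLs against indexed URLs."""
    submitted = sitemap_urls(sitemap_xml)
    indexed = set(indexed_urls)
    return {
        "submitted_not_indexed": submitted - indexed,   # worth investigating
        "indexed_not_submitted": indexed - submitted,   # potential index bloat
    }

# Made-up example data:
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/post-a</loc></url>
  <url><loc>https://example.com/post-b</loc></url>
</urlset>"""

indexed = ["https://example.com/post-a", "https://example.com/category/page/2"]
print(indexation_diff(sitemap, indexed))
```

The "indexed but not submitted" bucket is where pagination, tag archives, and accidental junk show up; the other bucket flags content Google is choosing to skip.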
 
Small sites with above-average engagement on 3-word keyword domains are OP.
EMDs are getting a lot of niche immunity atm.
Partial match is sometimes getting it, but not always.
Really surprised more people aren't taking advantage atm.
 
Small sites with above-average engagement on 3-word keyword domains are OP.
EMDs are getting a lot of niche immunity atm.
Partial match is sometimes getting it, but not always.
Really surprised more people aren't taking advantage atm.
I had an EMD that killed it for several years. Got DESTROYED about 2-2.5 years ago and I am fairly certain they did an EMD update around that time. I wouldn't have thought they would reverse that.
 
Yeah they did.
It cost me a fortune.
They also went on a network-smashing rampage.
Nothing's ever going to compare to those windows where hurr durr 3000 words of 2-cent content and some network links worked.
It's as busted as I've seen things get, post black-and-white animal stampede.
 
@Sutra, I honestly couldn't say. I haven't done that type of interlinking in many years. If I had to guess, I'd think it's not hurting you significantly if at all, but you may be leaving some boosts on the table. It wouldn't hurt to map out your posts and readdress the interlinking eventually. I definitely don't think it would account for a 25% drop.

I would absolutely run a site:domain.com search and note the number of pages indexed on the first SERP page, then click all the way to the last SERP page and note that number there (they'll change). Then I'd go to Search Console, compare the indexation in the Coverage report to these numbers, and compare all that to the number of pages you're submitting in your sitemaps. Take care to account for "pages indexed but not submitted in sitemap," which will be stuff like pagination, but if there are problems you'll find them there and in other places. Dig around in the Coverage report.

I really do think Index bloat is being addressed since around August 2018 and I think a lot of people have indexation problems. Trash in the index is going to drag your quality score down.

Did what you recommended and found this:

  • The number of pages indexed in Coverage is a little over 1k, which matches the amount in the sitemaps. (In case it matters - last year I was hit with the Yoast indexation bug but ran their fix and it looks like all those extra pages were deindexed - so now the correct pages are indexed)
  • When I do a site:domain.com search in Google there are only 300 results. However, at the top of the screen it says "showing 10 out of 1000 possible" results. I did a search for specific pages that were missing from the list of 300 and they did actually show up. So it seems they're indexed, but it's strange that Google only shows 300 results with the site:domain.com search. Does this indicate a problem on my end?
On separate note, but something that may be a problem:

I ended up finding a bunch of pages on my site where the article images are linked to the category that article is in. There's anywhere from 100 - 200 pages that have these types of category image links. Some of the articles have 1-2 images linking to the category, while others have 25+ images linked to the category. In at least one instance I found one article, with 25 images, linked to a defunct category. So when you click the image it goes to a category that's completely empty. No articles in that category at all.

I thought I remember us briefly talking about this years ago, and you said that it's best to not link images to categories like mentioned above because at some point in the future it could trip the Panda/Penguin filter. Do you think this could be causing the initial traffic drop in Aug 2018, and now again in June?
 
When I do a site:domain.com search in Google there are only 300 results. However, at the top of the screen it says "showing 10 out of 1000 possible" results. I did a search for specific pages that were missing from the list of 300 and they did actually show up. So it seems they're indexed, but it's strange that Google only shows 300 results with the site:domain.com search. Does this indicate a problem on my end?

Nah, this is why I was saying to click through to the last page of the SERP results. The last page usually shows the correct number of indexed pages. Often you'll see at the bottom of the other pages that you can repeat the search with the omitted results included. But even when you don't see that, the numbers aren't accurate and Google knows about it. I guess it's been a low-priority fix. The last page gives you a better idea. The Coverage report seems to be 100% accurate.

Do you think this could be causing the initial traffic drop in Aug 2018, and now again in June?

I don't know. I wouldn't have 25 images on one post all linking to the category. I wouldn't be doing that anyways. All you're doing there is cycling juice to the newest posts (by aiming at the category) instead of across all posts or to money posts.

But yeah, I'd definitely run a crawl and find any internal links aimed at categories with no posts or pages that don't exist. I'd delete those categories and fix the 404 links. I'd do the same for external links and fix any 404's. I imagine this is harming your crawl budget and definitely affects any kind of quality score, especially internal 404's and wasted links. Externals probably get a bit more lenience.
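A crawl tool will surface these, but the underlying check is simple enough to script. A standard-library sketch, assuming you already have a set of known-good URLs from a crawl or sitemap export; every URL and snippet of HTML here is invented for illustration:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags (including image links wrapped in <a>)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def broken_internal_links(page_url, page_html, valid_urls):
    """Return internal links on a page that aren't in the known-good URL set."""
    parser = LinkExtractor()
    parser.feed(page_html)
    site = urlparse(page_url).netloc
    broken = set()
    for href in parser.links:
        absolute = urljoin(page_url, href)
        # Only flag links on our own domain; externals get more lenience.
        if urlparse(absolute).netloc == site and absolute not in valid_urls:
            broken.add(absolute)
    return broken

# Invented example: one image link pointing at a defunct category
page_html = (
    '<a href="/category/empty-cat"><img src="pic.jpg"></a>'
    '<a href="/post-b">related post</a>'
    '<a href="https://other.example/z">external</a>'
)
known_good = {"https://example.com/post-b"}
print(broken_internal_links("https://example.com/post-a", page_html, known_good))
```

Run that across every crawled page and you get exactly the list being described: image links aimed at empty categories and other internal 404 candidates.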

But no, I don't think there's any one major factor affecting our sites in these broad core updates. I think we have to turn over every possible stone and fix every possible problem. Technical SEO seems like an obvious place for Google to get more gains in their algo and Technical SEO is getting more technical as CMS's evolve (bloat) and what not. Almost everyone is ignoring it due to lack of knowledge and skill.

I'm not just talking about things like "missing image alt text" and "missing meta description." I'm talking about things related to 1) crawling and 2) indexation. I think there's huge gains to be made there for most of us, and I think Panda deals with that kind of stuff far more than any of the bloggers or agencies ever realized.

I think of Panda as a quality score for your site, sitewide, based entirely on what's in the index and nothing more. If you have a potential for 100k traffic a month but your quality score is 85%, then you only get 85k a month traffic. It might take you 200 more articles posted and ranking to make up that 15k, or you can do an audit of your site and fix some problems and get your quality score up to 95%, you know. Then you get more mileage out of all of your old content and links, and out of your new content and links.
 