Indexing and Google's Core Updates, Product Review, Helpful Content, & Spam Updates

@LoftPeak, lol, that's extreme. Maybe ping Mueller on Twitter with the screenshot once you're finished with the sink stuff?

I had the same on 10% of my previous site in May/June 2021; it recovered fully in March 2022 and hasn't been affected since. I tried everything to get those pages indexed again: links, topical relevance, content refresh. And nada. I just had to wait 9 fucking months. Fortunately it came back stronger because of links. It was a fresh domain, though.
 
I'm in the same situation as many others here, can't find any reasonable explanation as to what really happened from the data I'm looking at.

Here's some info about my site:
- Roughly 1.5 years old
- A little over 300 articles (1,000-1,500 words), all written by me, all informational, and mostly zero competition
- DA fluctuating around 15-20
- No link building, and no naturally attracted links to write home about

GSC:
HITC34S.png


The first hit is dated September 1 (HCU), where I lost roughly 35% from peak. This time around, the traffic loss came primarily from a small shift in rankings on almost all KWs, with a lot of KWs dropping 1 to 3 spots.

The brief upward spike is the September core update (+15-20%), and the small hit back down is the PRU (not a single affiliate article or affiliate link on the site).

Finally, the last big hit is October 19 (spam update), where over 200 of my tracked keywords (each keyword matching one article) completely disappeared from the top 100, though the posts are still indexed.

Now, I have a small list of keywords that kept their top-3 positions and never fluctuated, and a large list of keywords that keep going in and out, with some popping back to their original positions and others dropping all the way down to the top 50-100 range.

Among the lost keywords, there are some that went in and out multiple times (but never stayed in their position longer than 24 hours) and some that never popped back since the 19th.

The strange part is that every other day I notice a keyword coming back that had never popped back before, but the bulk of the shifts in my tracked keywords come from the keywords that are always going back and forth.

For instance, if I see 30 keywords gained in SERPRobot (and usually about 30 keywords lost in the same check), maybe 28-29 of them are ones that just keep going in and out regularly without any improvement, and 1-2 are keywords that have never popped back in before.
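To make that bucketing less hand-wavy, here's a rough sketch of how you could classify keywords from a rank tracker export. The CSV layout (keyword first, then one top-100 position per day, with blank or 0 meaning not served) and the thresholds are all assumptions; adjust them to whatever your tracker actually exports.

```python
# Sketch: bucket tracked keywords by their fluctuation pattern.
# Assumes a hypothetical CSV export: keyword, then one position
# column per day (blank/0 = not in the top 100).
import csv

def classify(positions):
    served = [p is not None for p in positions]
    if all(served) and max(positions) <= 3:
        return "stable top 3"
    if not any(served[-14:]):
        return "gone, no return in 14 days"
    flips = sum(a != b for a, b in zip(served, served[1:]))
    return "flapper (in/out)" if flips >= 4 else "first-time return / other"

with open("serprobot_export.csv", newline="") as f:
    for row in csv.reader(f):
        keyword, *days = row
        positions = [int(d) if d.strip().isdigit() and int(d) > 0 else None
                     for d in days]
        print(f"{keyword}: {classify(positions)}")
```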

When all is said and done, my site is down roughly 60-65% from peak.
 
After days of digging into my site and every single one of my competitors within the same niche, I may have spotted some patterns. Trust me, I've gone over everything about each site from the ground up: EAT, links, internal links, anchors, site structure; you name it, and I can confidently say I've checked it.

Most of those things vary from site to site, so there's no definitive way to say they're the factors that could've lumped our sites into "some filter" Google may have developed. They may all be contributing factors, or none of them may be, but there's no way to know for sure.

So I dug a little deeper and started to see something that's obvious, but perhaps so small that we tend to overlook it.

Bear with me...

Disclaimer: this is just my hypothesis; it might be right or it might be wrong.

First, I want to point out some of the discussion in this thread; it's similar to the comments I'm seeing across other platforms where this is being reported. Look at these quotes and the pattern starts to show:

I had this problem with a site I worked on for like 5 years straight

You can see the plateau and eventual erosion of traffic

my main project lately where I've consistently published 6 posts, all on the same day, every week for about a year now

I haven't got a good explanation for anything, but what I'm seeing is that instead of January/February numbers, I'm seeing late June numbers.

My traffic drop wasn't as severe as others, but I can definitely feel the effect of 100k fewer visitors a month.

I will say, the only thing I did during my drop was build some branded anchors as I was a bit over-optimized due to a 301 from a while back.

In all my years of SEO, I have never seen a keyword drop from 1 to oblivion (at least without being penalized), let alone have it happen to multiple sites and multiple keywords. It was like a site-wide penalty but across multiple sites; even my competitors took a hit (I checked)

For example, we had 200 pages at the peak, with say 50k in traffic, in May. The number of indexed pages started decreasing slowly: 198, then 195, then back up to 196, etc. It wasn't a big drop that you would notice immediately.

We are still pumping out quality content and getting natural links as usual, but it's clear that more and more pages are being de-indexed, and as a result the organic traffic is shrinking.

It seems like this supports the theory in @tyealia's post, where the historical data was reflected in the index but then slowly degrades as new data is collected and added to the algo.

I am starting to see recovery on a few sites.

Some sites are recovering faster than others

Even though I know it's not our fault all this happened, it's still a bummer when you see your data trending downwards.

In the end, we continue to push quality content and start focusing on other traffic sources to offset the risk of Google crashing us again.

The first image is my Main site. About 10 years old and about 360 live articles.

As you can see in the chart, at the end of May it tanked. Then it came back for a short time, tanked again, came back again on Sept 26, then tanked again just recently at the end of November.

She just posts her articles and done.

Google has had a lot of bugs and problems in the past 4-5 years but nothing of this magnitude, and I doubt something like this would be allowed to persist this long if it was a bug. I'm assuming it's intentional but for what purpose, we don't know yet.

I feel your comment about indexing vs serving is likely what's happening.

Another one is to use as many entities as you can in your content: nouns, proper nouns, organization names, whatever. The best way to do this is to go through the Wikipedia article for the main "thing" in the topic you're writing about and look at all of its internal links and "See also" articles. Include all the nouns from there. This establishes that you have depth and breadth on your site about your vertical and niches.
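If you want to automate the harvesting part, the public MediaWiki API exposes an article's internal links directly (the "See also" targets you'd still eyeball manually). A minimal sketch, with "Skateboarding" as a stand-in topic:

```python
# Sketch: pull the internal links from a Wikipedia article as a raw
# entity list for a topic. Uses the public MediaWiki API.
import requests

API = "https://en.wikipedia.org/w/api.php"

def wikipedia_entities(title):
    params = {"action": "query", "titles": title, "prop": "links",
              "plnamespace": 0, "pllimit": "max", "format": "json"}
    entities = []
    while True:
        data = requests.get(API, params=params, timeout=30).json()
        for page in data["query"]["pages"].values():
            entities += [link["title"] for link in page.get("links", [])]
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow pagination
    return entities

print(wikipedia_entities("Skateboarding")[:50])  # stand-in topic
```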

I can imagine the indexing system telling some “sorting / ranking system” that a page is meant to rank during a query search

My unrefined hypothesis is simply that either they have an issue in the system they don't want to announce, or they're having problems with some new classifier and the thresholds are catching innocent sites.

I've also been fighting with this issue over the past year and, like most people, haven't been able to find anything that even remotely points in the right direction.

I've had the site since 2020 with around 200 articles now, built on an expired domain with no AI content, and no backlinks bought.

Roughly 1.5 years old
- A little over 300 articles (1,000-1,500 words), all written by me, all informational, and mostly zero competition

Now check out these images:

kCujSo9.png

This first site is actually going back down now.

dII4txV.png


KKosziX.png


CAJ5iUI.png


From each image, you can see the site's timeline and get a general idea of the content's age. You can also see that they've all reached some plateau, and since then it's been a slow but steady decay in traffic.

But does that really matter? Not in all situations...

Still, let's hear it from the horse's mouth, from when this started back in May-July. Here: Google's Comment on Core Updates

Pay attention to these generic guidance points:

There's nothing wrong with pages that may perform less well in a core update. They haven't violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there's nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.

One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It's going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before.

The list will change, and films previously higher on the list that move down aren't bad. There are simply more deserving films that are coming before them.


To check if this was the case, the best thing to do is look at the impacted pages plus the ones still ranking well, on my site and on my competitors' (the fluctuating ones too), to see what's up with our content vs. what's ranking when our sites are missing from the SERPs.

Here's some of what I saw in cases where my or my competitors' keywords disappeared:

Article 1:

DJBi6JJ.png

Article 2, competing with high-DR sites with more links:

1AZIhN7.png

Article 3 Competitor Example:
Xrs4XTM.png

Article 4 Competitor Example:
yxcn9YP.png


In situations where the pages have a high content score but still fall off:

9EQoM7i.png


Content Score is decent, but the top pages are doing a bit better, with one page doing extremely well. Keep in mind that in 80% of cases, the page in purple is older. Even so, the page falls short on other metrics.

EvznoAX.png


In situations where the content score is good and speed is good, other factors may come into play, and these pages may be "collateral damage."

In cases where pages remained at the top of the SERPs, I've checked 15 of my 25 pages still ranking in the top 3, and all show a picture similar to this one:

a5sN4rt.png


uRyIJrJ.png


fALuDZJ.png


Another common thing I noticed between my site and my competitors, going by the core update propaganda page, is that we're guilty of these things:

Presentation and production questions

  • Does the content have an excessive amount of ads that distract from or interfere with the main content?
  • Does content display well for mobile devices when viewed on them?

Comparative questions

  • Does the content provide substantial value when compared to other pages in search results? (Think of the NLP terms in Surfer that affect the content score: does your page cover most of them, which would raise its quality in Google's eyes?)

Expertise questions

  • Does the content present information in a way that makes you want to trust it, such as clear sourcing, evidence of the expertise involved, and background about the author or the site that publishes it, such as through links to an author page or a site's About page?

Maybe none of these things matter, maybe some do, but it's something that can be fixed.

To Sum it Up:
  • This may be a simple case of content age decay triggering a filter for their serving systems.
  • New content now provides more information, and in a way that Google prefers (more NLP terms on the page).
  • In some cases, my site had the most optimized article back then. Now I see that's no longer the case.
  • In the case of product reviews, since the content is outdated, the top-ranked products that were typically the same across all sites now only exist on my site or competitor sites that got hit.
  • Just hitting the update button doesn't seem to help, as Google might actually be comparing what's on the page (NLP terms) as its algorithms get smarter (see the term-comparison sketch after this list).
  • In cases where content scores are closely matched, links, images, and web vitals might be another thing they are considering to determine whether to serve the page.
  • I've seen fixing page speed and adding images help get content indexed on another site of mine that previously wouldn't get indexed no matter how many times I requested indexing.
  • Having too many ads (over-monetization) in a way that hurts user experience might not be a good thing if all the other negative conditions are present (old content, slow pages, few backlinks, lack of topical or EAT authority).
  • Fresh, well-optimized content seems to be sticking in the SERPs; the only new content that isn't sticking is the stuff with low content scores.
  • Keep in mind that not all articles will have competitors coming in and writing better pieces, but most will, so you might find rare occasions of traffic staying steady with just a slow decay on these pages.
  • It could still just be some bug, but that doesn't explain why all of the sites show a steady decline in search traffic (think of it like a dying heartbeat when you check your analytics).
That's all I saw, and it's the only thing I can see to fix in my case. Your case may be different, but look into it and see if you notice these patterns.
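To make the NLP-terms comparison in the list above concrete, here's a very crude stand-in for a content score check. Tools like Surfer weight phrases across many top pages; this just diffs word frequencies between two URLs (both placeholders), which is enough to spot glaring term gaps.

```python
# Crude sketch: which terms are frequent on a top-ranking page but
# absent from mine? Requires: pip install requests beautifulsoup4
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

def term_counts(url):
    html = requests.get(url, timeout=15).text
    text = BeautifulSoup(html, "html.parser").get_text(" ")
    return Counter(re.findall(r"[a-z]{3,}", text.lower()))

mine = term_counts("https://example.com/my-article")       # placeholder
theirs = term_counts("https://example.org/their-article")  # placeholder

gaps = [(w, n) for w, n in theirs.most_common(200) if mine[w] == 0]
for word, count in gaps[:25]:
    print(f"{word} (x{count} on the ranking page, 0 on mine)")
```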
 
@LoftPeak Interesting!
If we hypothetically think that Google has applied a freshness filter of some sort, when would you say your content went out of date, looking at your graphs?
 
@LoftPeak Interesting!
If we hypothetically think that Google has applied a freshness filter of some sort, when would you say your content went out of date, looking at your graphs?
On content older than a year, going on two years old, I'm assuming.

At that point, the filter is applied during, right before, or just after the end of any new update; it checks the content on the first page of the SERPs to find any outlier whose content is aged and not at the same quality (content score), or within range of it, as the pages already there. So the page being out of date might not be the only factor with older content; what if the content of the top-ranked pages has completely shifted or improved with new data (NLP terms) since you first wrote that article?

The only outstanding question is why the pages periodically return to their previous spots.
It might be a host of other factors, such as Google testing the results between the newer content and the old content and figuring out which is better to serve.

Typically, the older content has more authority through links and user metrics. But when you check for NLP terms, it's always those pages that are lacking when you pair them against the new content.

For some sites, links might be the crutch holding that site up. But what if all the competitors have similar page-level links, and your content is old and outdated compared to the new content?

The only way to further verify this is to wait and hope those keywords recover to their previous positions, then run a content score test between your page and the ones ranking once the site is being served again.

I've also done some checks on my only other site that took a dip like this, and it's also the only one with outdated decaying content. The thing is that it has a content score that's somewhat on par with the other pages in the SERPs, albeit a little lower. But I make up for it in page speed and maybe some other on-page metrics here:

V1iJ6de.png

I must point out that I had completely optimized the articles I'm testing once before, after they aged 6-9 months, way back in 2020-2021. So seeing these content scores now, you know things have changed.

Page Speed Comparison:

yaXgrX5.png


1QbnYlG.png


All my posts are like this, and it's the only correlation between the two sites in my portfolio that have seen this effect.
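If you want page speed comparisons you can repeat, the free PageSpeed Insights v5 API returns the same Lighthouse data as the web UI. A small sketch (the URLs are placeholders; an API key is only needed for heavy use):

```python
# Sketch: compare Lighthouse performance scores for several pages
# via the PageSpeed Insights v5 API.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def perf_score(url, strategy="mobile"):
    data = requests.get(PSI, params={"url": url, "strategy": strategy},
                        timeout=120).json()
    return round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)

for page in ("https://example.com/my-post",          # placeholder URLs
             "https://example.org/competitor-post"):
    print(page, perf_score(page))
```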

Until I get further verification, the only thing to do is ensure that your pages are up to date with new information and match the content score and page-quality metrics of your competitors. At least, that's what I'll be working on, since there's nothing to lose from doing so either way.
 
Now, I have a small list of keywords that kept their top-3 positions and never fluctuated, and a large list of keywords that keep going in and out, with some popping back to their original positions and others dropping all the way down to the top 50-100 range.

Among the lost keywords, there are some that went in and out multiple times (but never stayed in their position longer than 24 hours) and some that never popped back since the 19th.

The strange part is that every other day I notice a keyword coming back that had never popped back before, but the bulk of the shifts in my tracked keywords come from the keywords that are always going back and forth.

For instance, if I see 30 keywords gained in SERPRobot (and usually about 30 keywords lost in the same check), maybe 28-29 of them are ones that just keep going in and out regularly without any improvement, and 1-2 are keywords that have never popped back in before.

Update:

Within the past few days, I completely lost all the keywords that weren't in the top 10, and now the fluctuations have almost completely stopped (from 80-100 keyword swings to 10-20 today).

What's funny is that whatever's left of my top-3 keywords is practically untouched, which makes this the most nonsensical thing I've ever seen.

How can I be ranking in the top 3 for some terms but not even the top 100 for others?
 
I got got by this in May on one of my sites and haven't seen a recovery.
  • About 100 posts, mostly informational
  • 99/100 on ahrefs site audit
  • All content written by me
  • Non-YMYL
  • EAT is a fake persona with an author box, some back story, a small about page. Not great but better than most of the competition.
  • Seasonal, currently in the middle of the off season
  • Same strategy and layout as three of my other sites that weren't hit but have better link profiles. It's almost impossible to build links in this niche other than buying from blogs that sell guest posts. Not worth my time to do more advanced stuff.
JP4ZU2D.png

In terms of what I actually see, for primary keywords, everything dropped out of the top 100 except for a couple of posts that stayed in position 1.

In July I built some links and this hasn't seemed to make a difference.

Not sure about the indexed/served argument. About 2/3 of pages are stuck under "Crawled - currently not indexed" in GSC, although a lot of "indexed" pages are still out of the top 100.

Not a freshness issue, the site is pretty new.

There is no rhyme/reason to the pages that are still position #1. Mostly product review terms, with decent volume. A few info as well. Historically some of my best pages. But other good performers are also out of the top 100.

I am seeing some blips back into the index, like this:
ef3GpjP.png

The only thing I can think to do is run the site through Surfer and mess around with the words on the page. Speed and all that shit is fine. I would kind of like the extra $1k-$2k this site gives me in peak season, which we're coming up on, but I'm pretty stuck on things to try with it.

I have long stopped publishing on this domain.

My question: has anyone seen this affect sites with good links? Like 15+ links from DR 85+ news/gov/educational sites (read: references to your brand, and ideally your author, for news sites) that were built white hat.
 
Just saw this on Reddit and wanted to share, since examples of this particular problem seem to be quite limited:

X1K4Boc.png


dRCF1Re.png


absolutely brutal stuff
 
I have long stopped publishing on this domain.
Why would Google want to re-index you then?

Not sure about the indexed/served argument. About 2/3 of pages are stuck under "Crawled - currently not indexed" in GSC, although a lot of "indexed" pages are still out of the top 100.
In GSC, I'm guessing you're set to the desktop crawler rather than mobile? The desktop crawler is fucked right now, and if you're set to it, you've got an uphill climb.

This is either a sitewide technical SEO issue or a content issue; there's almost no other explainable reason. So, look for technical SEO issues, then take a handful of pages and change them very significantly to be more unique compared to the existing pages ranking for the intended keyword. Hit the page with social bookmarks and social shares, and ask GSC to recrawl. These three are intended simply to force Google to recrawl and index the content again.
 
In GSC, I'm guessing you're set to the desktop crawler rather than mobile? The desktop crawler is fucked right now, and if you're set to it, you've got an uphill climb.

all my sites are set to desktop crawler, regardless of desktop/mobile traffic distribution

is there something in particular that influences this?
 
I found a literal "support group" on Facebook regarding this topic and someone posted this:

ANpX4BR.jpg

That's like a 90% drop I'd say.

The group sucks; it's tiny and it's just people complaining, with nobody but this person sharing a graph. It's a VERY widespread problem that people are reluctant to talk about, as I think many fear it could reflect on their reputations.

This does look like a technical SEO problem to me too, largely one of indexation, BUT I've investigated dozens of these sites now and I've yet to run into that problem or anything obvious going on in robots.txt, etc. What I can say for sure is they all suffer very horrible on-site EEAT signals, or rather a lack thereof.
 
This does look like a technical SEO problem to me too, largely one of indexation, BUT I've investigated dozens of these sites now and I've yet to run into that problem or anything obvious going on in robots.txt, etc. What I can say for sure is they all suffer very horrible on-site EEAT signals, or rather a lack thereof.

a lack of EEAT signals indeed seems to be a common theme for the sites that got hit, including mine

But I can't quite accept it as the problem, for two reasons:

1) Should a lack of EEAT signals take you from #1-3 to not being served at all? There are literal cloaked spam pages around #11-20.

2) There are more sites without EEAT signals than with them, and you'll see a lot of them not having this problem.

Plenty of sites that are quite similar to mine on that front came away unscathed, and in fact most are even worse (not even an about page).
 
@exarch, I agree with you. It shouldn't cause a problem where pages aren't being served at all. I've said it many times: everyone that hasn't had their turn in the rodeo since May of 2022 is quick to dismiss this issue and say "you must be spamming, read the webmaster guidelines." But something much larger is afoot. When I'm researching competitors in my vertical and niches, this problem is affecting at least 60% of sites, if not more.

Somewhere on the first page I talk about theories. One of them was a "migration to dynamic machine learning algorithms" and how this could be rolling across "chunks of the index"; what better way to tackle that than one industry at a time? It doesn't explain anything other than why I'm seeing my entire industry affected all at the same time (September 2022 core update). Before that it was May, and an unannounced update around June/July for other people.

It does seem there's a common theme, as we're saying (lack of EEAT), and if you look at what replaced the hit sites, it's entirely about EEAT (brick-and-mortar stores with an online site and a blog on it). They have exactly what everyone that was hit was lacking.

My brain simply hasn't connected the dots yet, but I feel this is a piece of the puzzle. It seems obvious to me that something in these new algo updates is being extremely aggressive. But they have had plenty of time to solve it and roll it back, which they haven't, so I'm assuming it's purposeful and maybe "temporary," though even "temporary" can mean years in internet time. The only reason I say temporary is that if it's purposeful and being allowed to continue, then maybe they feel confident that whatever change is going on will shake itself out once the whole process is done. This is complete conjecture on my part.

But let's remember the change over to the mobile-first index. That started in March 2018 I believe and while it's mostly done it's still not complete.

All I know is that it's been a great kick up the rump to improve my custom theme and bake all these improvements in so every newer site starts in a better place. If this is something to be solved and gets solved, it feels like it's "go time" as there will be a window before a next giant disruption occurs (you'd think...)
 
I am working on a website that fits the pattern discussed above.

4ERh7oX.jpg


- it is the summer outdoors niche, so a decline in August and September is normal and November and December traffic is also OKish

- monetized with Amazon and Ezoic

- almost 300 articles are published: 30 best-X listicles, 150 informational articles with products included, and 100ish informational articles with either no aff links or 1-2 text links. 99% of articles are indexed and stay indexed.

- no purchased direct links. The only link building done is a Google stack using Google Docs, interlinked and inter-embedded with branded Web 2.0 sites, profiles, and branded cloud pages (S3, Azure, Google Cloud). Some light link building (GSA contextual, Web 2.0 contextual, and cheap PBNs) points to the Google stack and other T1 properties. Some natural backlinks have been acquired over time. DR is a bit below 50.

- Around New Year, Elementor was removed. Core Web Vitals have been in the green since February (high 80s and rising).

- After New Year, EAT was introduced. I basically copied the EAT setup from totalshape.com: I listed 2 real people on the staff page, with real images and links to FB and LinkedIn profiles. One is a writer, the other a researcher (me). Except there is no phone number listed, and the address is a fake PO box number at a real US post office. I am hesitant to edit the address because I have put it on the Google stack properties, branded Web 2.0s, and a few citations. And I don't believe G would penalize the website over a PO box number we are not in fact renting.

- There are a number of articles constantly ranking #1 organically (either #1, or #2 below YouTube or the featured snippet). They do not move when G drops the site. These are both informational articles and best-X listicles; no pattern.

- If an article is not constantly ranking #1, it fluctuates between page 1 and not being served when G demotes the site. But all articles remain indexed; I can find them using site:, or "article title", or article title + site name.

- The demoted articles (at least some of them) occasionally come back to page 1. I have a few indicator keywords for articles that go up and down, and I check them a few times per day using the brightlocal.com local SERP checker to avoid personalized results (or incognito mode on the phone). I see them appearing on page 1 a few times a day, and I also see traffic coming to them. Then 10 minutes later they are gone. G testing user engagement signals?

- I don't believe there are technical issues. I scan the site weekly with Website Auditor and occasionally with the Ahrefs audit tool, and nothing exceptional shows up in the reports.

- Last summer I found around 20ish outgoing links that looked like paid links. My JV partner insisted they might have been placed by a former employee earning a few bucks on the side. I deleted those links back in July or August.

- My only explanation is that G does not like the content: insufficient coverage of topics within articles and missing relevant entities. The content is written by cheap outsourced writers. The site is a JV where the other party (the owner of a content mill) is responsible for the content, and I could not enforce my content guidelines while the site was doing fine.

- We have published around 10 articles on the legal aspects of these outdoor activities and equipment in different US states, something along the lines of 'Skateboarding Laws in Oregon' covering questions like 'can you skateboard on the sidewalk in Oregon' and 'Oregon skateboarding helmet laws'. Could these make the site look like a legal site without any EEAT? I still see competitors ranking for these queries.

- Now we are revising the content to expand the informational content and rewrite product reviews according to G product review requirements.

- I noticed another strange issue that is present when the site is demoted/penalized, like right now.

Let's say the search phrase is 'how to carry couch in suv'. The article title is 'how to carry a couch in a suv'

'how to carry a couch in a suv' - my article is ranking
'how to carry a couch in suv' - my article is NOT ranking
'how to carry couch in suv' - my article is ranking
'how to carry couch in a suv' - my article is NOT ranking

I get the same results repeatedly searching back to back, so this is not G showing a random result once.

This does not make sense.

For some keywords, my article may rank when one synonym is used (SUV) but not when another is (sport utility vehicle).
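Since checking these variants by hand is tedious, something like the sketch below can enumerate every article-word combination of a query so you can feed them into whatever rank checker you use. The optional-word list is an assumption; synonym swaps (suv / sport utility vehicle) could be added the same way.

```python
# Sketch: enumerate query variants that keep or drop the little
# words ("a", "the"), so serving tests can be run systematically.
from itertools import product

def article_variants(query, optional=("a", "the")):
    tokens = query.split()
    slots = [i for i, t in enumerate(tokens) if t in optional]
    variants = set()
    for keep in product((True, False), repeat=len(slots)):
        drop = {s for s, k in zip(slots, keep) if not k}
        variants.add(" ".join(t for i, t in enumerate(tokens) if i not in drop))
    return sorted(variants)

for v in article_variants("how to carry a couch in a suv"):
    print(v)
# how to carry a couch in a suv
# how to carry a couch in suv
# how to carry couch in a suv
# how to carry couch in suv
```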
 
wdhtNWS.png

To give you guys some hope: this is a page that was down for a good 10 months before recovering, showing similar patterns. I had 7-8 pages in total like that; they got deindexed in an update and recovered with an update as well, albeit a year later.
 
all my sites are set to desktop crawler, regardless of desktop/mobile traffic distribution

is there something in particular that influences this?
As far as I can tell, this gets set when a site is first crawled / added to GSC and there's no obvious way to change it. I'd guess it's related to how mobile friendly your site is. So if you're not 100% perfect, I'd be working on that.

You can see the desktop crawler in your server logs: it comes, and then nothing gets indexed. Meanwhile, the mobile crawler comes and you get an instant index and ranking. It's something very specific to the desktop crawler, perhaps even a bug on Google's side; or the mobile index is the only primary index now, so the desktop crawler isn't translating 1:1 to the "true" index.
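A quick way to see this split yourself: tally Googlebot hits in your access log by user agent. The smartphone crawler's UA string contains "Mobile"; the desktop one doesn't. The log path is a placeholder for your own setup:

```python
# Sketch: count desktop vs smartphone Googlebot hits in an access log.
from collections import Counter

counts = Counter()
with open("/var/log/nginx/access.log") as log:  # placeholder path
    for line in log:
        if "Googlebot" not in line:
            continue
        counts["smartphone" if "Mobile" in line else "desktop"] += 1

print(counts)
```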
 
@JanisG, I'm experiencing something similar with one project. I don't think it's the content.

Let's say your content is not good enough for the first position; then it should be ranked lower, not out of the SERPs. Do all the sites in the top 100 have better content than yours? I doubt it.

Issues like this have been ongoing for some time; multiple users report them on Reddit, webmaster forums, and so on. There does not seem to be a correlation.

One of my competitors in another niche got hit in the same way. I'm ashamed to say he has a better site than mine. I know him personally and we chat from time to time. He has EAT/EEAT/etc., better hosting, a better theme, better content... His site is huge compared to mine. He got hit on the 24th, the same day I did. The two niches are not related at all. My site in that particular niche survived, at least for now; no changes for me.

No matter how I look at it, it does not make any kind of sense.
 
Let's say your content is not good enough for the first position; then it should be ranked lower, not out of the SERPs. Do all the sites in the top 100 have better content than yours? I doubt it.
This is an old way of thinking, from when Google had the resources to index a higher percentage of existing webpages.

Google might crawl Page A and think it's perfect, ranking it #1. Then it crawls Page B, which it thinks is complete shit but good enough to be indexed. Importantly, the content on Page B is meaningfully different from Page A, so it ranks it #52.

Then it crawls your Page C, which it thinks is pretty great and would like to rank #4. But Google can tell that the informational payload of Page C is virtually the same as Page A: you recommend the same products, you give the same reviews, you share the same suggestions when answering an informational question, you use the same entities in the text. From the user's perspective they get virtually the same thing from Page A and Page C, so there's no point indexing Page C. Google would actually rather serve Page B, despite how shit it is, because at least its content says something different, so there's a further opportunity for the user to find what they want.

With the proliferation of AI content, this style of approach is becoming critical for Google, because they do not have the resources to index and serve all of the webpages popping up. I don't know that this is precisely how it works, but it's likely how I'd build it if I were them, and it aligns fairly well with what I'm seeing. At some level, they have to choose which content not to index, or to deindex, and you're fitting the mould.
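None of us knows what Google's dedup actually looks like, but the "same informational payload" idea is cheap to approximate. A toy sketch using word-shingle Jaccard similarity (the sample strings are placeholders):

```python
# Toy sketch: near-duplicate scoring via 5-word shingles + Jaccard.
import re

def shingles(text, k=5):
    words = re.findall(r"\w+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

page_a = "the best beginner board is the XYZ because it is cheap and stable"
page_c = "for beginners the best board is the XYZ because it is cheap and stable"
print(f"payload overlap: {jaccard(page_a, page_c):.0%}")
```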
 
This is an old way of thinking, from when Google had the resources to index a higher percentage of existing webpages.

Google might crawl Page A and think it's perfect, ranking it #1. Then it crawls Page B, which it thinks is complete shit but good enough to be indexed. Importantly, the content on Page B is meaningfully different from Page A, so it ranks it #52.

Then it crawls your Page C, which it thinks is pretty great and would like to rank #4. But Google can tell that the informational payload of Page C is virtually the same as Page A: you recommend the same products, you give the same reviews, you share the same suggestions when answering an informational question, you use the same entities in the text. From the user's perspective they get virtually the same thing from Page A and Page C, so there's no point indexing Page C. Google would actually rather serve Page B, despite how shit it is, because at least its content says something different, so there's a further opportunity for the user to find what they want.

With the proliferation of AI content, this style of approach is becoming critical for Google, because they do not have the resources to index and serve all of the webpages popping up. I don't know that this is precisely how it works, but it's likely how I'd build it if I were them, and it aligns fairly well with what I'm seeing. At some level, they have to choose which content not to index, or to deindex, and you're fitting the mould.
Sorry, but I find it hard to believe a goddamn H1 with the exact match and no content on a sites.google.com page is better content than my "shitty" article and deserves a place in the top 100. Or pages from 2017 on abandoned domains, or keyword stuffing on hacked sites like it's the old days...

What you say would make sense if part of the results weren't filled with garbage. And OK, maybe one, two, even half of my articles are BS, but all of them? Not ranking even for a long tail in the top 100?

In my case, there are some possible scenarios that I'm currently investigating.

There is also the possibility of some small screwup on Google's end. That kind of thing has happened several times before.
 
I am getting hit now. It started today; down around 20%, I'd guess. Zero review content.
 
Hey everyone. I'm new here; I found this forum via a reddit comment, and it looks pretty good compared to some of the other ones I used to frequent in the past. I wanted to write an introduction post.

I've been building websites since around 2002. I've had a wide range of successes and failures along the way, from six-figure affiliate sites using paid traffic to SEO-based content sites that barely make $100. I run the latter right now, and as you can tell, after all these years I still don't really have an idea of what I'm doing.

I recently launched my first paid product. It's a freemium WordPress plugin, and I'm trying to market it. It's off to a very slow start.

The post that got me here was about the May Google algorithm updates. I wanted to share in that thread, but I don't have privileges yet. This is what I saw on one of my sites.

usUBlIH.png

As you can tell, I got completely annihilated in May according to GSC, but I'm starting to see some slight recovery. I'll admit that some content may have been of low quality, but it was all unique, non-AI-generated content. The other thing I suspect is that E-A-T is playing a role, as I was not posting as a real human. On top of that, the theme was a very basic blog-style theme; the site wasn't really branded at all.

Things I've attempted to do to correct this:
  1. Got a new professional looking theme and did a complete re-branding
  2. I'm the author now and I explain my expertise in the footer of each post and link to my about page
  3. I made sure to have pages like about, contact, terms of service, and privacy policy
  4. I audited the content and removed quite a few lower-quality posts that I felt didn't bring value to the site
  5. I went through and updated every single post and re-worked all of the content.
  6. Disavowed some crappy links I found
  7. Published some new, unique and pretty high quality content in my opinion
  8. Decreased page load time
  9. Decreased number of ads per page
  10. Added schema markup to all appropriate pages (aggregated reviews, FAQs, etc)
Now it's just a waiting game and continuing to publish content, although it's extremely hard to stay motivated given that I have almost no traffic now.

Anyway, Hi.
 
So I noticed that several blog posts related to bikes (as in bicycles) are ranking for motorcycle-related keywords but not for bicycle-related keywords.

Yesterday I edited 5 of these articles and added about and mentions Article schema to shove it in G's face that these articles are about bicycles. Today those articles are back on page 1 for their main keywords. Coincidence?
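For anyone wanting to try the same thing: about and mentions are standard schema.org CreativeWork properties, so an Article can point at the entities it covers. A sketch that emits the JSON-LD (the headline and entity URLs are placeholders):

```python
# Sketch: emit Article JSON-LD with "about"/"mentions" entity hints.
import json

schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Best Commuter Bicycles",  # placeholder
    "about": [{"@type": "Thing", "name": "Bicycle",
               "sameAs": "https://en.wikipedia.org/wiki/Bicycle"}],
    "mentions": [{"@type": "Thing", "name": "Bicycle commuting",
                  "sameAs": "https://en.wikipedia.org/wiki/Bicycle_commuting"}],
}

print('<script type="application/ld+json">')
print(json.dumps(schema, indent=2))
print("</script>")
```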
 
Why would Google want to re-index you then?
That could have a correlation with ranking, albeit a small one, on a sitewide level. It wouldn't have anything to do with indexing.

In GSC, I'm guessing you're set to the desktop crawler rather than mobile? The desktop crawler is fucked right now, and if you're set to it, you've got an uphill climb.

This is either a sitewide technical SEO issue or a content issue; there's almost no other explainable reason. So, look for technical SEO issues, then take a handful of pages and change them very significantly to be more unique compared to the existing pages ranking for the intended keyword. Hit the page with social bookmarks and social shares, and ask GSC to recrawl. These three are intended simply to force Google to recrawl and index the content again.
The indexing crawler is Googlebot Smartphone, and no, it's not a sitewide technical SEO issue nor a content issue. It's an EEAT issue, as Ryu says; more specifically, a lack of links. Not every site without links is being hit, but if you have them, you won't get hit by this. I'm talking actual links, not paid-for ones.

We're coming back a little bit, without having changed anything since my last post. Most pages came back into the index, including those in positions 10-100. However, after the latest core update, 90% of these have once again dropped out according to my rank tracking for parent terms, although traffic is basically the same as the week before the update. I see impressions dropping but clicks staying the same.

C4VzvZa.png
 