Massive influx of links

Thanks Carter, wise words. We have practically nothing to lose in comparison to the life-changing revenue we stand to gain. Gloves are off.

Follow-up shots will start with an aggressive content cull. Trimming much more fat in terms of long-tail / deadweight articles that could give Google an excuse to keep up the HCU penalty.

Once the content's thinned out, an aggressive internal link cull. Gutting exact and strong partial match anchors. Even if it means re-building our internal links from the ground-up.

Finally, restructuring the site to push our local business signals as much as possible, reducing the spotlight on our SEO articles.

Then forcing G to take notice via crawl/index tools and, if needed, building shit links to the links I've disavowed.

Will keep you guys updated.
 
if needed, building shit links to the links I've disavowed.

Oh no, it's not an option. If you only do steps 1 through 5, but not 6, 7, and 8, then you'll fail. You HAVE to force Google to reindex the links otherwise you'll wait 6 months for a core update. Google isn't going to go back to some link from a site with no traffic that looks like hxxps://search.fyi.xyz/keywords/2019/long-city-name-k.html anytime soon.

Half-assing it is setting yourself up for failure and self-sabotage. You need to do the whole thing. Don't complain if you half-ass it and don't get results.
 
Do I pull the trigger? Any sanity checks worth doing first?

Quick answer:

I did exactly what you outlined above and while I have not yet recovered, I do not appear to be losing any more ground (for now).

Full answer:

I followed the exact steps that @Grind outlined using LRT and the crawlers. I'm following this up with further manual checks and disavows from spam links triggering 403/404/fake redirects in GSC. I'm also doing a COMPLETE overhaul of internal anchors on a site with 1500 articles.

Call me paranoid, but I'm pushing to get all this done before the ominous May update that Google says is going to target parasite SEO.

I have no idea if this strategy will get me back to where I was - but I'm going to do everything I can to stack the cards in my favor - and like @CCarter said (and the same goes for me)...

You have nothing to lose as far as I can see.

Follow-up shots will start with an aggressive content cull. Trimming much more fat in terms of long-tail / deadweight articles that could give Google an excuse to keep up the HCU penalty.
I'm planning to do the same. But I'm not doing this until I complete Grind's recovery strategy + fix all the internal links. Then after I see where I stand I'll start culling deadweight where necessary.

I think the approach/sequence I'm taking is best because if Grind is right (which I believe he is), then theoretically the longtail and deadweight articles aren't the actual problem... and therefore should not be the priority... the spam links and internal anchors should be.

Too many messages coming in before I hit publish... sorry Moderators.

Finally, restructuring the site to push our local business signals as much as possible, reducing the spotlight on our SEO articles.
Would love to see a thread (or just a tactical outline) about how you're going to accomplish this. It's still something I'm thinking about and have not completely decided how to approach it.
 
How to build links to the spam? Is it simply going to Fiverr and buying dofollow backlinks, or is there more to it?
 
How to build links to the spam? Is it simply going to Fiverr and buying dofollow backlinks, or is there more to it?
You either pay someone to spam the spam links with cheap links, or you learn how to do it yourself, including how to use various software, scrape lists, rent proxies, generate spintax, etc. Link indexers do not work and will waste your time and money. Cheapo spam is the solution. Open-reg articles or dofollow high PR blog comments are the best bet, in my opinion. It can be trash because you're just hitting trash to get it recrawled.
 
You either pay someone to spam the spam links with cheap links, or you learn how to do it yourself, including how to use various software, scrape lists, rent proxies, generate spintax, etc. Link indexers do not work and will waste your time and money. Cheapo spam is the solution. Open-reg articles or dofollow high PR blog comments are the best bet, in my opinion. It can be trash because you're just hitting trash to get it recrawled.
Thanks! I will just buy the links :smile: What is your opinion on links now, Ryuzaki? In the past I believed you leaned toward 'Google can figure it out'. How about now? Do you also believe in disavowing links, just like CCarter and Grind?
 
Thanks! I will just buy the links :smile: What is your opinion on links now, Ryuzaki? In the past I believed you leaned toward 'Google can figure it out'. How about now? Do you also believe in disavowing links, just like CCarter and Grind?
I believe Google can figure out the very common spam that everyone gets hit with. If you have something unique going on, disavowing can help. And it’ll never hurt to disavow trash. Just be careful you don’t grab links that Google is actually giving you a positive boost for. The risk is that Google thinks some of this trash is a positive.

My opinion is that Google is once again in a phase where, in general, links are either neutral or positive, but rarely toxic. The only spam I’ve ever seen be toxic in YEARS is stuff that looks like you built out a PBN. Automated Web 2.0 stuff like Blogspot comes to mind there.
 
I'm planning to do the same. But I'm not doing this until I complete Grind's recovery strategy + fix all the internal links. Then after I see where I stand I'll start culling deadweight where necessary.

I think the approach/sequence I'm taking is best because if Grind is right (which I believe he is), then theoretically the longtail and deadweight articles aren't the actual problem... and therefore should not be the priority... the spam links and internal anchors should be.
Good point. I think we're overdue a cull anyway, so I'm going to do that first to save some internal link work. I think either order makes sense.

Out of curiosity, how are you going through internal links? The best way I've found is using the same tool (Link Whisper) that gives you a list of internals pointing to a post. You can select links one by one by anchor text and delete them all there. It will still take forever, but I can't see a better way. I can export and mark up exact matches quickly in a spreadsheet, but I can't find a way to bulk-delete.

Would love to see a thread (or just a tactical outline) about how you're going to accomplish this. It's still something I'm thinking about and have not completely decided how to approach it.
Will do - basing it on what Joaky said in the Algo Update thread. Gut feel is to copy whatever local biz competitors that thrived in the HCU have done. Probably even make service area pages etc - whatever makes us seem more like a local business to a machine learning algo

I followed the exact steps that @Grind outlined using LRT and the crawlers. I'm following this up with further manual checks and disavows from spam links triggering 403/404/fake redirects in GSC. I'm also doing a COMPLETE overhaul of internal anchors on a site with 1500 articles.
FYI make sure to check Semrush -> Backlink Analysis too. Just went in after seeing Grind's latest video and found 7,500 new links I hadn't tackled before... another day of disavows it is
 
how are you going through internal links?
I'm manually going through each of the 1500+ articles. FML. I know there are tools out there but here's my reasoning...

I didn't use any tools for interlinking; everything was manual and done by an editor during the publishing process, following a strict A><B><C><D><E silo structure pointing up to a parent/money page. As far as I can see, my only failure was using exact matches throughout.

I'm maintaining the same structure with the same target pages, only changing the anchors. Priority one is to make them as natural/conversational as possible. But I'm also positioning the new links higher on the page to show Google I want them prioritized... real fugazi fugazi shit.

So I'm using this opportunity to:

1. de-optimize (or is it optimize?) anchors by removing exact matches
2. re-position links in order of priority
3. re-position links higher on the page to demonstrate priority
4. run a site-wide quality control check of links and content
5. when I'm done, I'll use a tool to check for duplicates and make adjustments as needed

I'm busting through at a decent pace and just running down the list. Thank god I'm organized and have an incredibly well-documented content plan/process. Makes life a lot easier. Right now I'm about 10% of the way through. It's really not that bad. I told my wife I'll see her in May.
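For anyone who'd rather not eyeball every post by hand, step 1 above (finding exact-match anchors) can be semi-automated before the manual pass. A minimal stdlib-only sketch; the keyword list and HTML snippet are made-up placeholders, and a real run would feed in your exported post HTML:

```python
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects (href, anchor_text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def exact_match_anchors(html, money_keywords):
    """Return links whose anchor text exactly matches a target keyword."""
    parser = AnchorCollector()
    parser.feed(html)
    keywords = {k.lower() for k in money_keywords}
    return [(href, text) for href, text in parser.links
            if text.lower() in keywords]

# Hypothetical example: "best widgets" stands in for a money keyword.
html = ('<p>See our <a href="/best-widgets/">best widgets</a> guide, '
        'or <a href="/about/">read more here</a>.</p>')
print(exact_match_anchors(html, ["best widgets"]))
# → [('/best-widgets/', 'best widgets')]
```

Run that over the article export and you get a worklist of (page, anchor) pairs to rewrite, instead of opening all 1500 posts blind.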

basing it on what Joaky said in the Algo Update thread. Gut feel is to copy whatever local biz competitors that thrived in the HCU have done.
Looking forward to this. And I think this is the right approach too. I'm keen to do this but I'm going to set this aside until YouTube is humming along, then I'll go back in and see how I can improve site structure from a "business" perspective.

make sure to check Semrush
Thanks, I'm digging into SEM now - especially image backlinks. I did pull SEM backlinks into LRT when initially running scans, but I'm still finding links that LRT didn't pick up. I'm also spending time in GSC, though my inspection quota limits the number of spam links I can find each day.
 
Awesome thread.

I am following Grind's advice on multiple client sites.

I have found that Opphive is a great tool for speeding up the process and is affordable.

I use Ahrefs + SEM + GSC for link data.

Then use OPP filters to whitelist or disavow the obvious good/bad stuff.

Then, manually tidy up the rest.

And you guys should follow Grind on Twitter.

Cheers

Tim
 
Awesome thread.

I am following Grind's advice on multiple client sites.

I have found that Opphive is a great tool for speeding up the process and is affordable.

I use Ahrefs + SEM + GSC for link data.

Then use OPP filters to whitelist or disavow the obvious good/bad stuff.

Then, manually tidy up the rest.

And you guys should follow Grind on Twitter.

Cheers

Tim
Welcome to the party!

How long ago did you disavow? Any results so far?
 
I haven't seen hard proof, but I trust Grind and Paul from Opphive not to talk shit.

Grind says he has had multiple recoveries, and I believe him.

But obviously, there are no guarantees, and I'm not expecting miracles; it's just another piece of the puzzle IMO.
 
I'm not expecting miracles; it's just another piece of the puzzle
This is where I'm at as well. One piece of many.

As I said here, I'm all in on this strategy. But, my rational brain is still skeptical, especially when OGs like @Ryuzaki clearly explain why charts used to support this strategy are not related/correlated - not to mention when there is a clear divide among some of the most respected voices on BuSo.

There are a few guys, myself included, openly chasing down this strategy - which probably means there are even more guys pursuing it behind the scenes. It will be interesting to see the kind of results, good or bad, we're all able to pull out of this in the end - hopefully, guys are willing to share what happens either way.
 
I haven't seen hard proof, but I trust Grind and Paul from Opphive not to talk shit.

Grind says he has had multiple recoveries, and I believe him.

But obviously, there are no guarantees, and I'm not expecting miracles; it's just another piece of the puzzle IMO.
I'll ask some guys for permission to share some screenshots without divulging identifying details.

None of these guys doing the work and recovering will ever believe Google again...about anything...and they are very reluctant to make themselves targets in any way.

Google has a vested interest in this never being about negative SEO, intentional or not.
 
[screenshot attachments]

So to be clear, he just did the disavow and re-crawl process and still saw results in two weeks.

He hadn't even addressed the over-optimized internal anchor links, which the first HCU in December 2022 started targeting. Based on sites that took an initial traffic ding there, then a bigger one in Aug/Sept 2024, with often a final death punch in the March double-update hell, we can only assume subsequent versions have tightened the screws further.

Something something horse to water something something...good luck bros.

Edit: Got a follow up just now. This is probably my last post on this topic going forward, it's fucking tiresome tbh. You have all the tools. Test it and see.

[screenshot attachment]
 
Another here with around 50% more links in SEMrush than LRT.

You've also sold me, and I'm currently underway with the process.

Fuck it, like you've all said multiple times, what is there to lose!

Will report back!
 
LRT had me pegged at ~80k spam links... but they refused to give me the URLs and said all of the toxic domains were included in the 7.5k they sampled. So, I ended up with roughly...

7500 unique from LRT
3500 unique from SEM (images mainly)
1000 unique from GSC "Not found (404)"
1000 unique from GSC "Page with redirect"
Purchasing Ahrefs to see what they have for me next

Lots of duplicates mixed in across the sources, but I'm sitting on roughly 13k unique URLs to disavow. I'm also disavowing the majority of the domains just in case.

All the URLs/domains I'm finding are complete trash. Many of the sites are offline or redirect to spam sites that Chrome refuses to connect to. Lots of "/1000" and hotlinking.
 
Oh no, it's not an option. If you only do steps 1 through 5, but not 6, 7, and 8, then you'll fail.
Just to confirm, there are not 8 steps, right?

You basically gather all the toxic links you can find, disavow, wait 72 hours, then reindex those same disavowed links?
 
1000 unique from GSC "Not found (404)"
1000 unique from GSC "Page with redirect"

Those would be internal links, not external?
 
Gather all toxic links you can find, disavow, wait 72 hours, then reindex those same disavowed links
As far as I know, that's it.

I wanted to be extra safe so I waited 96 hours to start recrawling the disavowed links.

If you're using LRT, I'd suggest going through and manually reviewing the links. I was surprised to see a few "good links" slip into their toxic list and a few "shit links" in the clean list. So if you can find time, it's probably worth doing.

Those would be internal links, not external?
You would think... but under "Pages" in GSC, I noticed I had a massive increase in the number of pages being listed under "Page with redirect" and "Not Found (404)" ... thousands.

So, I started inspecting them manually in GSC. Sure enough, thousands of shit websites directing Google to pages on my site that don't exist.

It looks like someone scraped multiple other sites (in a similar niche to mine) and then replaced the SLD with my site's. Then they just sprayed my site with links (to these non-existent pages) from around the internet... like I shared earlier in this thread, most of the domains triggering 404s and redirects are complete gibberish and use .pl, .de, and .it TLDs.

I've found a lot of people (outside BuSo) complaining about the "/1000" links appearing in their GSC. The answer I see everyone sharing is the standard "No need to do anything, Google knows to ignore spam".

Since I'm following Grind's playbook to the letter... I'm digging into all of these sources and trying to get every link I can find... disavow at the URL (and domain) level... and then get every link recrawled over and over again.

And it might be worth mentioning that this is only 1 of 10 tactics I'm currently pursuing semi-simultaneously, though I'm mainly focused on tactics 1 to 4 right now while background work is happening on a few of the others.
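Those "gibberish domain" and "/1000" patterns can be used to pre-filter a GSC export before the manual inspection pass. A sketch with made-up heuristics and example URLs; treat it as a triage aid under those assumptions, not a vetted toxicity model, and always eyeball before disavowing:

```python
import re
from urllib.parse import urlparse

# Heuristics drawn from the thread: "/1000" paths, the ccTLDs the spam
# wave used, and gibberish hostnames. All illustrative, not exhaustive.
SPAM_TLDS = (".pl", ".de", ".it", ".xyz")

def looks_spammy(url):
    """Flag a referring URL that matches the spam patterns above."""
    parsed = urlparse(url)
    host, path = parsed.netloc.lower(), parsed.path
    if path.rstrip("/").endswith("/1000"):
        return True
    if host.endswith(SPAM_TLDS):
        return True
    # "Gibberish" check: a long run of consonants in the first label.
    label = host.split(".")[0]
    if re.search(r"[bcdfghjklmnpqrstvwxz]{5,}", label):
        return True
    return False

# Hypothetical referring URLs, standing in for a GSC export column.
candidates = [
    "http://xkqvrtzw.example.pl/widgets",
    "https://legit-blog.example.com/review",
    "https://linkfarm.example.xyz/keywords/1000",
]
print([u for u in candidates if looks_spammy(u)])
```

Anything flagged goes into the disavow-candidate pile; anything not flagged still gets a manual look, since spam that imitates clean patterns will slip through rules this crude.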
 
None of these guys doing the work and recovering will ever believe Google again...about anything

For years Google has said links aren't important and that they only need a few to rank a website. Well, even the Googlers are retrenching on this:


Think about all those SEO gurus that were running around saying "links aren't important" and "Google knows how to label this and that" and "don't use the disavow". One of the main sources was this Gary Googler guy... and now years later he said "I shouldn't have said that... I definitely shouldn't have said that".

How many people bought what Gary said and now can't figure out why they can't rank? Not even directly from Gary, but from the SEO gurus and XSEOs that ran around regurgitating that "links don't matter" claim? For YEARS.

The whole algorithm is still based on links, anchors, and on-page content.

I listen to people in the trenches not the propaganda being spewed.

Just to confirm, there are not 8 steps, right?

You basically gather all the toxic links you can find, disavow, wait 72 hours, then reindex those same disavowed links?

You do it 3 times. Submit the disavow list, wait 72 hours, then re-index/re-crawl the disavowed links. Wait another 72 hours, re-index/re-crawl the disavowed links. Then wait another 72 hours and re-index/re-crawl the disavowed links. You should be 10-14 days in and starting to see some recovery - if not, then you go the GSA SER / scraperbox route.

In the meantime you are fixing your internal exact-match anchors - I suggest starting with the pages that were hit the most and working your way to lesser-impacted pages. With @Grind's permission I'm writing a quick guide on the exact methods, so it's not just pieces here and there that have to be puzzled together.
 
Sorry for being a noob.

I now have a list of URLs to disavow. However, I use the domain property in GSC (domain.com). I have another property (https://www.domain.com) but I don't see any traffic on that property in GSC.

Should I disavow on this property?
And do I also need to disavow on http://www.domain.com and https://domain.com?
 
You would think... but under "Pages" in GSC, I noticed I had a massive increase in the number of pages being listed under "Page with redirect" and "Not Found (404)" ... thousands.

So, I started inspecting them manually in GSC. Sure enough, thousands of shit websites directing Google to pages on my site that don't exist.

It looks like someone scraped multiple other sites (in a similar niche to mine) and then replaced the SLD with my site's. Then they just sprayed my site with links (to these non-existent pages) from around the internet... like I shared earlier in this thread, most of the domains triggering 404s and redirects are complete gibberish and use .pl, .de, and .it TLDs.

I've found a lot of people (outside BuSo) complaining about the "/1000" links appearing in their GSC. The answer I see everyone sharing is the standard "No need to do anything, Google knows to ignore spam".

Since I'm following Grind's playbook to the letter... I'm digging into all of these sources and trying to get every link I can find... disavow at the URL (and domain) level... and then get every link recrawled over and over again.
I'm with you on all of this so far... I have a site with exactly the same issues.

But what are you disavowing in this case?

The external links that point to broken URLs on your site?

If so where are you grabbing them from?
 