Recovery efforts after getting hit by last 2 updates

@Prentzz thank you for that post - I couldn't agree more that it's a case of just being too close to the edge of the cliff. It's the optimal place to be to see the waves but it just takes a little erosion to drop you off the edge.

That's the analogy I started using about 10 years ago with link-building customers, and it still holds today. Exact-match anchors are super powerful; they always have been. As recently as a year or so ago on here I saw people mention how powerful they still were, with lots of people pushing to go back to them even harder than before Penguin because of how well they were working.

I always used to say to customers who wanted to go all out like that: it's safer to be just a bit further inshore. Don't be one of the first sites to drop off that cliff. Give yourself the chance to 'see the waves' for longer and give yourself more time to notice what's happening to others and adapt.

I've seen the same with content. A law firm SEO buddy of mine that we did links for updated one of his client's pages to be almost like an encyclopaedia page - multimedia, podcasts, example cases - and immediately after updating it lost tons of places (after a change that major, I think Google deliberately randomises how it responds to make testing harder). But over time, all the people on the edge of the cliff with their 'accident law [town]' and 'car crash in [town]' pages, with all the bold, italic, and underline crap, that were outranking his client eventually fell off the cliff as Google kept adjusting, and his page moved up anyway - a net gain for the client worth way more than the cost of the page rewrite.

Sure, at the time you could have concluded that spamming local law firm pages with bad links was the cheapest/easiest way to win, but those law firms had all built their houses right at the cliff's edge.
 
@Prentzz 15% can mean just 1 exact-match link in every 10. You're not gonna send 1,000 links...

Mostly I agree with you.

But if you check the data from the Cora guys, keyword density is still a thing, along with a lot of other factors we try to ignore nowadays because Google is "too smart".

In one particular niche, I have 2 sites that I use to gather data about changes. I rank them for the same keywords. I've kept them clean, with few but good links, and everything vanilla - nothing that could give the impression of being overoptimized. 20-ish posts, the same structure, and so on.

They don't rank well at the same time. One of them always holds the first position, but the other one drops to 3-4, sometimes 5. After an update, the second one pops to #1 and the first goes down. They have never ranked #1 and #2 at the same time.

The point I was trying to make is that sometimes sites are just "blessed". Most probably there are some internal rules setting sites apart at a really low level, but to mere mortals like me they will seem blessed.
 
Hey, I don't know if this is random or related, since people mentioned keyword density, but a few of my posts had the main H1 keyword too many times.

I actually thought having the keyword more times would help, like "Conclusion: Keyword here". I'd stuff it in a little bit like that, and a few of my articles weren't appearing in search. NOTE: they were still indexed because they were high quality. Anyway, I lowered the density and boom, they're ranking in Google, somewhere on page 2-4.
 
> In one particular niche, I have 2 sites that I use to gather data about changes. I rank them for the same keywords. I've kept them clean, with few but good links, and everything vanilla - nothing that could give the impression of being overoptimized. 20-ish posts, the same structure, and so on.
>
> The point I was trying to make is that sometimes sites are just "blessed". Most probably there are some internal rules setting sites apart at a really low level, but to mere mortals like me they will seem blessed.

Interesting thought. I'd say it's more that Google uses randomness as a tiebreaker when they really can't tell which page is better (read more: stochastic tinkering or simulated annealing). Injecting randomness is a pretty damn good solution when decision-making is hard (read more: Buridan's donkey). I actually use this technique for lots of things, including trivial shit like which ice cream flavor to pick.
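
To make the tiebreaker idea concrete, here's a toy sketch (purely illustrative - it shows the general technique, not anything to do with how Google actually scores pages): when several candidates land within a small margin of the top score, pick the winner at random instead of always returning the same one.

Code:
// Toy illustration of randomness as a tiebreaker (not Google's actual logic):
// when several candidates score within a small margin of the best one,
// pick among them at random instead of always returning the same winner.
function pick_with_random_tiebreak( array $scores, float $margin = 0.01 ) {
    arsort( $scores );        // sort descending by score, keeping keys
    $best = reset( $scores ); // highest score

    // Everything within $margin of the best counts as "tied"
    $tied = array_keys( array_filter( $scores, function ( $s ) use ( $best, $margin ) {
        return ( $best - $s ) <= $margin;
    } ) );

    return $tied[ array_rand( $tied ) ];
}

// e.g. pick_with_random_tiebreak( array( 'site_a' => 0.91, 'site_b' => 0.905, 'site_c' => 0.70 ) )
// returns 'site_a' or 'site_b' at random, but never 'site_c'.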

There is a smoking gun though in many, many cases.

If I had to pick one problem as the smoking gun, it would be index bloat caused by having tons of single-use tags. 75% of my index was tag pages. If I look at my main competitor's site (700k traffic), they use lots and lots of tags. The key difference is they noindex them.

In retrospect, if I had to only make one change, I would've noindexed the tags instead of deleting them. I'm only using categories now, but I'm using them correctly for grouping articles around topics.
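
If anyone wants to see how bad their own tag bloat is before deciding between noindexing and deleting, a quick sketch like this (my own throwaway snippet - run it however you like, e.g. via WP-CLI's wp eval-file) lists every tag attached to fewer than 2 posts:

Code:
// Rough sketch: list every tag attached to fewer than 2 published posts,
// i.e. the single-use (and empty) tag archives behind the index bloat.
$tags = get_terms( array(
    'taxonomy'   => 'post_tag',
    'hide_empty' => false, // include tags with 0 posts
) );

foreach ( $tags as $tag ) {
    if ( $tag->count < 2 ) {
        printf( "%s (%d posts)\n", $tag->name, $tag->count );
    }
}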
 
@Prentzz

> In one particular niche, I have 2 sites that I use to gather data about changes. I rank them for the same keywords. I've kept them clean, with few but good links, and everything vanilla - nothing that could give the impression of being overoptimized. 20-ish posts, the same structure, and so on.
We're trying a similar method of ranking 2 sites with the same keyword. So far the original site is fluctuating in rankings and the second has a total of 25 out of 50+ pages indexed.

Is the second site you're running faring well? Or do you plan to continue with it?

> Hey, I don't know if this is random or related, since people mentioned keyword density, but a few of my posts had the main H1 keyword too many times.
>
> I actually thought having the keyword more times would help, like "Conclusion: Keyword here". I'd stuff it in a little bit like that, and a few of my articles weren't appearing in search. NOTE: they were still indexed because they were high quality. Anyway, I lowered the density and boom, they're ranking in Google, somewhere on page 2-4.
Correlation doesn't equal causation, of course. But I'm curious: what keyword density did you start at, and where is it now?

We simply use the primary keyword 5x in every article (not including headings) and call it good. So far no real issues.
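
If it helps, this is roughly how we sanity-check that (just a quick sketch, the function name is made up): count the keyword in the body copy only, with the headings stripped out first.

Code:
// Quick sketch: count occurrences of a primary keyword in a post's body copy,
// with H1-H6 blocks stripped first so heading usage isn't counted.
function count_keyword_in_body( $post_id, $keyword ) {
    $content = get_post_field( 'post_content', $post_id );

    // Drop heading elements entirely
    $content = preg_replace( '/<h[1-6][^>]*>.*?<\/h[1-6]>/is', ' ', $content );

    // Strip remaining markup and count case-insensitive matches
    $text = wp_strip_all_tags( $content );

    return substr_count( strtolower( $text ), strtolower( $keyword ) );
}

// e.g. count_keyword_in_body( 123, 'your primary keyword' ) should come back around 5.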
 
> We're trying a similar method of ranking 2 sites with the same keyword. So far the original site is fluctuating in rankings and the second has a total of 25 out of 50+ pages indexed.
>
> Is the second site you're running faring well? Or do you plan to continue with it?
The second site is doing ok, not great but ok. For now, I'm not working on it. Maybe later on.

Noticed lots of fluctuations also. Some really wild.
 
> We're trying a similar method of ranking 2 sites with the same keyword. So far the original site is fluctuating in rankings and the second has a total of 25 out of 50+ pages indexed.
>
> Is the second site you're running faring well? Or do you plan to continue with it?

> Correlation doesn't equal causation, of course. But I'm curious: what keyword density did you start at, and where is it now?
>
> We simply use the primary keyword 5x in every article (not including headings) and call it good. So far no real issues.
I'm not sure of the exact numbers, but I had the review keyword in about 3 subheadings. Like I'd do "Conclusion: Product Review" and have it in 2 more H2s. So I just removed those H2s, and it worked. I mean, there's no reason I should be stuffing it into the conclusion like that.
 
Status: 10% monthly growth. This is the first time I've had growth in the last 6 months.

Next: This is good, but I'm not too excited. This could be a "dead cat bounce". Ignore short term results and keep grinding. I've been improving site structure, deleting crappy posts (got past that psychological barrier), and intensely fixing/optimizing posts**. In a small sampling of optimized posts, I can see significant improvement in results (higher CTR, better rank).

**Much more intensely. If I was going 25 mph before, I'm going 150 mph now. I fixed more posts in the last month than I did in the previous 5 months combined.
 
Here's a periodic update.

The March core update gave me a 25% bump for a few weeks before it dropped back down. It's pretty clear to me I'm still in algorithm jail.

I read in another post that it can take 8-14 months to recover. I'm on month 9 at this point and I've been plateaued at 80k all year. I just keep publishing and updating content.

It's been tough to keep going. Effort and results aren't connected right now, so sometimes it seems like lunacy to keep trying regardless of the results. But this reminds me of the very beginning: all effort, no results while being in the Google sandbox. As long as I know the strategy is good - quality x quantity - then it's just a matter of patience.
 
> Here's a periodic update.
>
> The March core update gave me a 25% bump for a few weeks before it dropped back down. It's pretty clear to me I'm still in algorithm jail.
>
> I read in another post that it can take 8-14 months to recover. I'm on month 9 at this point and I've been plateaued at 80k all year. I just keep publishing and updating content.
>
> It's been tough to keep going. Effort and results aren't connected right now, so sometimes it seems like lunacy to keep trying regardless of the results. But this reminds me of the very beginning: all effort, no results while being in the Google sandbox. As long as I know the strategy is good - quality x quantity - then it's just a matter of patience.

Yeah, it's a bit of a catch-22: if you don't work on it, why should Google reward it? But if you work on it and Google doesn't reward it, why should you work on it?

There's no good answer to that, except maybe to consider whether selling earlier makes more sense. Sell at the top, before the "inevitable" hit.
 
> Core Update drops
> Hello darkness my old friend

I got smacked by the core update. My competitors got smacked too, so I'm not too worried about it. Before this, I was actually starting to gain keywords and traffic after plateauing for the first 6 months of the year. So I think my strategy and process are fine.

One thing: I realized my internal linking is bad. I have 450 posts now. If I only count contextual content links*, then 75% of the pages are orphaned. I did a bunch of research and came to the conclusion that Link Whisper w/ my GSC data is a good way to catch up in this scenario. About 1/3 of its suggestions are usable (with tweaks). I'm going through posts manually and carefully adding highly relevant content links and also making sure they have 5 semi-relevant related posts at the end. This will take a long time to get through them all, but I think the quality is worth it (I want people to actually click the damn things, so relevancy is important).

*Not counting "related posts" links at the end of the content, navigation links, and category / archive links.
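
For anyone wanting a similar orphan count on their own site, this is the basic idea (a rough sketch of my own, not exactly what I ran): scan every published post's body for internal links and count the posts nothing points at.

Code:
// Rough sketch: estimate how many published posts have zero contextual
// inbound links from other posts. Only post_content is scanned, so
// navigation, category, and "related posts" links don't count.
$posts = get_posts( array(
    'post_type'   => 'post',
    'post_status' => 'publish',
    'numberposts' => -1,
) );

$has_inbound = array();
foreach ( $posts as $post ) {
    // Pull every href out of the post body
    if ( preg_match_all( '/href=["\']([^"\']+)["\']/i', $post->post_content, $matches ) ) {
        foreach ( $matches[1] as $href ) {
            $target_id = url_to_postid( $href ); // 0 unless it's an internal post/page URL
            if ( $target_id && $target_id !== $post->ID ) {
                $has_inbound[ $target_id ] = true;
            }
        }
    }
}

$orphans = 0;
foreach ( $posts as $post ) {
    if ( empty( $has_inbound[ $post->ID ] ) ) {
        $orphans++;
    }
}

printf( "%d of %d posts have no contextual inbound links\n", $orphans, count( $posts ) );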
 
> > Core Update drops
> > Hello darkness my old friend
>
> I got smacked by the core update. My competitors got smacked too, so I'm not too worried about it. Before this, I was actually starting to gain keywords and traffic after plateauing for the first 6 months of the year. So I think my strategy and process are fine.
>
> One thing: I realized my internal linking is bad. I have 450 posts now. If I only count contextual content links*, then 75% of the pages are orphaned. I did a bunch of research and came to the conclusion that Link Whisper w/ my GSC data is a good way to catch up in this scenario. About 1/3 of its suggestions are usable (with tweaks). I'm going through posts manually and carefully adding highly relevant content links and also making sure they have 5 semi-relevant related posts at the end. This will take a long time to get through them all, but I think the quality is worth it (I want people to actually click the damn things, so relevancy is important).
>
> *Not counting "related posts" links at the end of the content, navigation links, and category / archive links.
Link Whisper will only suggest pages that don't already have links, right?

Example: if you already have a link from Page A to Page B, it will not suggest Page B, correct?

Asking because I'm interested in the product as well... it wasn't so bad to do it manually when the site was smaller, but it starts to get kind of ridiculous remembering all the pages that you have.
 
> Link Whisper will only suggest pages that don't already have links, right?
>
> Example: if you already have a link from Page A to Page B, it will not suggest Page B, correct?
>
> Asking because I'm interested in the product as well... it wasn't so bad to do it manually when the site was smaller, but it starts to get kind of ridiculous remembering all the pages that you have.
That's right. If A -> B already, it won't suggest B.

Also, by default, it doesn't count "related posts" links unless you toggle that setting on. I left it off. So in some cases, I may end up having a link in content and in the related posts section, but it's with different anchor text. I don't think that's a problem.
 
I was just starting to get a little organic traffic for the first time EVER (I'm a noob) but then the August update snuffed out that small fire. Now I have to wonder what I did right to get that small flicker of traffic and how to get it going again. Currently at zero.

That's why I'm here. Hi everyone.
 
Here's some code if you want to noindex tag pages with only 1 post allocated to them. Tag archives with 2 or more posts will still be indexed.

Code:
// Send a noindex,follow robots directive on tag archives with fewer than 2 posts.
// Hooks into the wp_robots filter (WordPress 5.7+); drop it in your theme's functions.php.
add_filter( 'wp_robots', 'wpse_cleantags_add_noindex' );
function wpse_cleantags_add_noindex( $robots ) {
    global $wp_query;

    // Only touch tag archives that contain 0 or 1 posts
    if ( is_tag() && $wp_query->found_posts < 2 ) {
        $robots['noindex'] = true;
        $robots['follow']  = true;
    }

    return $robots;
}
Thank you for this. Just implemented the snippet. I never thought about index bloat until now. I have 2,000+ tags, many of them with 0 posts.

They're mainly typos and relics from my switch from Blogger to WordPress back in 2018. I cut a bunch of posts at the time.

I like using tags simply for the related post functionality. Categories are broad, tags are more specific and can cross categories.

Think:
Category: Food - Recipes
Tag: Hot Dogs
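
For what it's worth, the related-posts bit is basically just this (rough sketch, the function name is mine): grab a handful of other posts that share at least one tag with the current one.

Code:
// Rough sketch of related posts via shared tags: return a few other posts
// that have at least one tag in common with the given post.
function related_posts_by_tag( $post_id, $count = 5 ) {
    $tag_ids = wp_get_post_tags( $post_id, array( 'fields' => 'ids' ) );
    if ( empty( $tag_ids ) ) {
        return array();
    }

    return get_posts( array(
        'post__not_in'   => array( $post_id ), // never relate a post to itself
        'tag__in'        => $tag_ids,          // any shared tag qualifies
        'posts_per_page' => $count,
        'orderby'        => 'rand',
    ) );
}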
 