On-Page Optimization - Industry Has Lost It

RomesFall
Over the last few years, bro, the SEO industry's take on on-page has turned into bigger and bigger shit; the only real bastion of truth I've found is BuilderSociety. I'd say in the last 18 months I've gone beyond learning through others, but when I do learn, it's here.

What I rely on now is real data, real optimization.

Most of my opinions are contrarian; most people think I'm talking shit because it goes against the grain. I know the truth about what works and what doesn't... I know the misnomers because I spent the time learning. When John Doe tells me I'm talking shit, I can guarantee his opinion came from some gooruu. Why? Because they all talk shit.

I've had clients come to me based on word of mouth for the results I get. Nothing else... Yet when I make recommendations, they lose their wig and freak out until I boost them up anywhere from 4-40 positions for their keywords. Using MY methods. Yet they still freak out over whatever for weeks, as if I've planted a time bomb on their site. Came for results, got results & still not happy because of industry bullshit. This is why I rarely EVER take on client work since I escaped the agency trap.

TF-IDF - The industry thinks this is some kind of method to manipulate rankings. N-grams - Again same thing. LSI - same thing. Bullshit, bullshit, bullshit.

Long-form content? Don't get me started. Even taking SEO out of it, these absolute sheep think that 10,000 words is a good idea for usability and conversions? They don't even know how people browse apparently.

Guys don't understand why:
- best gaming keyboard for kids
and
- best gaming keyboard for hardcore gamers
can both rank in the same SERP. It's cannibalization, apparently... Okay, Sherlock. The intent is different for each, so they don't even understand what the fuck intent is. They don't understand basic things like buyer/customer/search personas.

Cannibalization is a term they've clearly never once looked up, because what they think it means... Damn.

The guruliminati would have you believe all kinds of things, not because they're deliberately spreading misinfo, but just because they couldn't give a shit. Do you think Larry Barry Edgar IV cares what ranks sites? No, he makes 80% of his income through suckers who slurp down that bottle of bilious ooze they call 'knowledge bombs'. He makes enough money that he can literally shotgun-approach any project he goes into until some shit sticks, then he'll tell you that LSI/TF-IDF/word volume is what makes him rank. K.

I'm hoping someone reads between the lines here lmao...

Okay, let me try and make some coherent sense.

> what works
> is good content
> based on good research
> continuing analysis
> constant optimization

To be clear, I'm not talking about link building here. It's not actually my strong point, although I still think I'm ahead of most in understanding, if not in scale.

All it takes is one keyword to rank to make bank. Reality is, you can rank hundreds of keywords on page 1 with not a lot of content and under 50 RDs total. But not if what you're focusing on is fucking TF-IDF, LSI, word count, density... bullshieeeet.

Like I've said elsewhere... Correlation is not causation, and most of these guys can't be trusted to even look up what cannibalization means or find out how search engines actually work... How the hell do you expect them to tell the difference between their own bias and the fact that they've triggered a bunch of factors with their shotgun-approach long-form content, with no idea what they've actually done right or wrong? But it's LSI, DUDE!

I have a couple thousand pages ranking for dozens of keywords across multiple sites where I'm hardly using more than 1,000 words. I've seen articles ranking for more keywords than they have words, not counting stop-words either.

As @Potatoe said to me recently:
"The ideal content length is the same as a dress, long enough to cover the essentials and short enough to be interesting".
^ Paraphrasing hard here, sorry haha

I could go on forever about the bullshit floating around about on-page, or all the wise experts I've seen recently who literally repeat the same stuff I was posting on a certain other forum 5 years ago. Yes, they are that far behind. Why? They don't test, and most of that shit isn't even what works best in 2018 lol

I'm not even the best at the on-page stuff, not by a long shot, and I'm not claiming to be. Nonetheless, that doesn't mean what I'm saying here isn't 100% true.

So let's get focused and make this forum even more of a light in the darkness than it already is.
 
The thing that pisses me off is that a lot of these gurus don't back up their claims of expertise with any real evidence, apart from some blurry earnings screenshots which are easy to fake.

Take the last "Niche Site Project" at Niche Pursuits, which I used to follow avidly. The people the team picked from the applicants and took through their "proven" processes ended up with sites that did hardly anything or sod all.

Sure, not all sites are winners, but with all of that expertise and guru power behind them, surely they should have had more than a slim chance of success.
 
I don't know where that whole "must have more than 2,000 words in an article" line of thinking comes from, but I've seen it all over the internet for the last year without any real evidence to back it up.
 
The Thin Content Argument

One of the chief arguments for long-form content is avoiding 'thin content'.

But as with almost anything, the industry has somehow fucked up the meaning of this as well... The powers of deduction are exemplary here: thin is a synonym for slight, which is another way of saying small, so thin content must be "small content".

Wronggg. There are a number of good reasons why thin content has nothing to do with word volume.

Thin content is an issue of low-quality content that doesn't match the intent of the primary topic that Google or other search engines believe your page to be about. In fact, a 500-word page can be perfect for satisfying the intent of many keywords and searchers, in the same way that 2,000 words of filler garbage can do an amazing job of saying very little.

Whether it's a 500-word page or a 2,000-word page that's thin, the cause is the same: in almost all cases, a lack of organizational structure on your web page and a lack of understanding of the key areas to address and consider.

You don't need to be slapped for thin content to have thin content, and that's something I sense isn't understood.

Plan your content better, get better writers, and at the very least do some editorial work on each article, and I promise you'll never have this problem (knowingly or unknowingly), no matter how many words your articles run.
 
I don't know if you have an agenda or a sales pitch coming up, but what you're saying sounds right to me.

> what works
> is good content
> based on good research
> continuing analysis
> constant optimization

I like it when I arrive at conclusions on my own and GURUS like you ;-) confirm it for me.

There's also definitely the lazy case of thinking more is better, yet Google is definitely smart enough to tell trash from cash at this point. Filler, fluffer, low level of contextual relevance, these are easy for Google to figure out.
 
> There's also definitely the lazy case of thinking more is better, yet Google is definitely smart enough to tell trash from cash at this point. Filler, fluffer, low level of contextual relevance, these are easy for Google to figure out.

I think one of the main issues is that people don't quite comprehend how insanely good Google actually are at Information Retrieval. I think learning the basics of that is required study for anyone who wants to drive sustainable traffic via SEO.
 
> I think one of the main issues is that people don't quite comprehend how insanely good Google actually are at Information Retrieval. I think learning the basics of that is required study for anyone who wants to drive sustainable traffic via SEO.

Yeah, I figured that out once Google began showing me pages without the keywords I was searching for, going purely on intent and content grouping.

As for studying this, what would you recommend?

I mean, there are multiple ways to go at this. One is good journalistic/scientific method: how to write sourced, interactive content. And of course you could begin reading about the actual technical side of it (which would be WAAAY too technical for most).
 
> As for studying this, what would you recommend?

> I mean, there are multiple ways to go at this. One is good journalistic/scientific method: how to write sourced, interactive content. And of course you could begin reading about the actual technical side of it (which would be WAAAY too technical for most).

Learning anything about Natural Language Processing is a good start. That helped fuel a lot of my own testing ideas for this year, which have resulted in a bit of a breakthrough for me personally. A lot of that actually focuses on content writing and, more importantly, text analysis.

https://www.briggsby.com/on-page-seo-for-nlp < great beginner article

Prior to that I loved SEOByTheSea, as he frequently links to various patents that are good to read. Of course, none of those patents are necessarily in use, but they give you some good ideas for testing. He does try to break it down to be less technical, which is good because, as you say, the technical side is difficult for most.
 
What does the keyword density look like for most of your articles?

The info in the link about subject clarity and pronoun usage points to an NLP-optimized article having a higher density - do you find this is the case?

I don't really use density, but when I do track it for research purposes it's often the case.

I'm a bigger fan of frequency, which is the daddy of density if word volume is the mommy. Another issue with density is people will often forget to remove the stop words. So if you're gonna do it, definitely remember to do that.
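To make that concrete, here's a minimal sketch of the counting I'm describing - toy stop-word list, made-up snippet, purely illustrative:

```python
import re
from collections import Counter

# Tiny illustrative stop-word list; real lists (e.g. NLTK's) run to ~180 words.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "is", "it", "that"}

def term_stats(text: str, term: str) -> tuple[int, float]:
    """Raw frequency of `term`, plus the density derived from it,
    with stop words stripped before counting."""
    words = [w for w in re.findall(r"[a-z0-9']+", text.lower()) if w not in STOP_WORDS]
    freq = Counter(words)[term.lower()]            # the number you directly control
    density = freq / len(words) if words else 0.0  # density = frequency / word volume
    return freq, density

freq, density = term_stats("The best gaming keyboard for kids is a keyboard that lasts", "keyboard")
print(freq, f"{density:.1%}")  # -> 2 33.3%
```

Frequency is the raw count; density is just that same count divided back out by word volume.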
 
> I don't really use density, but when I do track it for research purposes it's often the case.
>
> I'm a bigger fan of frequency, which is the daddy of density if word volume is the mommy. Another issue with density is people will often forget to remove the stop words. So if you're gonna do it, definitely remember to do that.
What do you mean, frequency? Aren't frequency and density correlated, especially if you mention word volume in the same sentence as frequency? Density is frequency/word volume.
 
The Briggsby link was super interesting, particularly the stuff about being explicit in your answers to questions. To me, this is just another step towards quality online content.
 
> What do you mean, frequency? Aren't frequency and density correlated, especially if you mention word volume in the same sentence as frequency? Density is frequency/word volume.

That's exactly my point.

Frequency is far more useful when actually optimizing, for several reasons. Google may or may not use keyword density; if they do, it's very unlikely it's a large part of their core algo. There are far more accurate and interesting ways to calculate the importance of a term in a collection of text.

All of them (that I'm thinking of) rely on frequency and, yes, word count. None of them rely on density, though.

Density puts an unnecessary extra step between your analysis and optimization. I can tell you which of the two I've had more success with, but then that's already implied.
 
Creating content is like creating a food dish. It's not really supposed to be a mathematical formula, 'cause one dish might need more salt and another less - in this case, "density" or whatever the gurus are touting as the "key factor" in Google's algorithm.

If you step back and think logically - none of it makes sense. Getting a keyword density of 2.9% or 5.7% or 21% - give me a break. It's like art: there are different styles for the times, and there is no perfect "percentage" of red you should use for your painting - and each industry has different requirements as well.

But back to the food analogy - sometimes you want a quick bite to eat so you grab an apple, other times you want a 7-course meal 'cause you haven't eaten in days. That's how content consumption works too. Anyone touting "you need 10K words" to rank is coming at it from a scientific standpoint and not really thinking about the consumer or their desires - I doubt that content will even resonate with the user base.

True marketers aren't sitting around with beakers trying to formulate the perfect word count or density; they aren't scientists. Scientists make horrible marketers - their content comes out bland, and you'll see low user engagement and social proof as a result.

Great content provokes and inspires new and different ideas on its own. Fewer words can have a more lasting impact.

It's 2018/2019 - do people really think keyword density is still part of Google's algorithm at this late date, with all the technological advancements of the last 12+ years? CTR would be more useful than keyword density as a measurement of whether the content was good. To put that time in perspective, think about a parallel technological advancement: the iPhone didn't exist 12 years ago, and it literally changed how people thought of smartphones. Do you really think Google has been sitting on their ass that whole time - that while a tool like the iPhone came along and changed the world, Google has done NOTHING in terms of search advancement?

Content in my mind = a great food dish for your audience. Or a piece of art - some might hate it, some might love it, but as long as it touches your core audience in a meaningful way, you'll get far more than simple SEO benefits from it.
 
@CCarter

"If I had more time, I would have written you a shorter letter." - Mark Twain

Who wants to read content that sprawls for 10,000+ words? How many even want 2,000? I can't tell you the last time I read more than 1,000 from beginning to end.

If some guys here don't believe me, get a load of these stats:

  • 81 percent of users only skim content they read online [1]
  • The average attention span for a piece of content is just 8 seconds [1]
  • 80% of people spend their time browsing versus doing in-depth reading. [1]

What CCarter says about the artistic element of content is so f'in true...

If the average SEO is still thinking about keyword density in 2018, it's like taking a magnifying glass into a lab where everyone else has a microscope.

The microscope wins every time, and that's what Google is. The things they do are SO complex that we (and I mean nobody) can reverse engineer precisely how they're treating content. Trying to reverse engineer your competitors' content has some benefit, but doing it by looking for averages is pretty much a waste of time.

The reason I recommend things like frequency over almost anything else is quite simple. It's easy to track and it's very easy to modify.

Once you have EVERYTHING else in place: the content itself has got to be damn good - entertaining to some extent, because we're trying to fight against statistics like the ones I shared above. More importantly, the content needs to serve a purpose and offer value both ways. Entertainment, again, can be the core value (think BuzzFeed), but there's more than one way to offer value. [1]

Then you've got to deal with cognitive biases and a whole bunch of other shit... [1]

Needless to say, once you've done all of that, and done your absolute best job at creating killer content (which is something that evolves as you evolve as a marketer) you can think about your promotion.

At a later point, I'm personally inclined to look at content that is underperforming. Like CCarter says, there are some easily available, very valuable metrics like CTR that you can look at. But other important things include Session Duration and Bounce Rate - and these are still available for free. Once you install something like Sumo, or maybe something more sophisticated, you can look at how people interact with your content. [1]

All of these metrics can point to underperforming content that needs to be improved. Just the same as your rankings. Frequency is a brilliant little thing that you can use on content that you're trying to improve in the SERPs because what boss man Carter says is true. Your piece of content is unique and you can increase or decrease frequency, monitor the SERPs (your position) and adjust as needed.

This kind of thing comes way later in the process, personally I'm looking at content I published 3 months ago minimum. If I've completely fucked it I remove the entire damn piece (focusing on what makes me money) or try and figure out what I did wrong and come back at it fresh if there's a lot of potential.

Word count, keyword density - all this stuff doesn't really matter, and even my own recommendations on things like frequency are a time-and-place thing, and a fairly unimportant one overall when it comes to running a business, let alone a site, which is often just part of a bigger business.

Stick with your position of most opportunity - creating content that engages and spreads.

References:
1. Nothing I'm saying here is new, but to most pure 'SEOs' it is.
 
> Density puts an unnecessary extra step between your analysis and optimization. I can tell you which of the two I've had more success with, but then that's already implied.

Yeah, but to me - you are looking at the same exact thing, just from a different angle. By changing keyword frequency, you are in turn changing keyword density. You are just not counting it in a percentage (%), but in numbers.

> Frequency is a brilliant little thing that you can use on content that you're trying to improve in the SERPs because what boss man Carter says is true. Your piece of content is unique and you can increase or decrease frequency, monitor the SERPs (your position) and adjust as needed.

Again, the same thing. I feel you are kind of trying to be a contrarian. To me, TF-IDF, n-grams, and LSI all have a place in SEO, especially if you are not an expert on a given topic. They help you find what search engines expect in a ranking web property's content, in terms of keywords and how expansively the topic is covered. As someone hiring others to write a piece of content, they can give you notes on what the writers should focus on, or at least what parts of a given topic to include.

All of that is unnecessary if you know the topic and provide a meaningful, well-researched content piece. To me it comes down to matching user intent and making sure your page completes the searcher's desired task. Also, I might be completely wrong - you are far beyond me in terms of SEO - just my 2 cents on the topic.

> If you step back and think logically - none of it makes sense. Getting a keyword density of 2.9% or 5.7% or 21% - give me a break. It's like art: there are different styles for the times, and there is no perfect "percentage" of red you should use for your painting - and each industry has different requirements as well.

I agree 100%.

> Content in my mind = a great food dish for your audience. Or a piece of art - some might hate it, some might love it, but as long as it touches your core audience in a meaningful way, you'll get far more than simple SEO benefits from it.

At the end of the day, regardless of whether some think these are useless metrics, most of this is opinion anyway, IMHO. We can all agree that serving user needs is what is most important.
 
> Yeah, but to me - you are looking at the same exact thing, just from a different angle. By changing keyword frequency, you are in turn changing keyword density. You are just not counting it in a percentage (%), but in numbers.

It's not the same thing, yes frequency is part of density but that's like saying conversion rate is the same as visitors. It's not. They are two totally different metrics.

If you can't see why keyword frequency is better than density, in my opinion it's because you're not using either of them for the right things.


> Again, the same thing. I feel you are kind of trying to be a contrarian. To me, TF-IDF, n-grams, and LSI all have a place in SEO, especially if you are not an expert on a given topic. They help you find what search engines expect in a ranking web property's content, in terms of keywords and how expansively the topic is covered. As someone hiring others to write a piece of content, they can give you notes on what the writers should focus on, or at least what parts of a given topic to include.

In this case I think you're one of the people I'm talking about in this post, who think TF-IDF and LSI are something they aren't. Sure, they have a place if you believe they're something they're not, but what you're actually talking about is something else entirely.

This is what I hate about the common SEO mindset. Everyone has an opinion based off of what they've read from some other SEO. If you actually bought a book or read some research papers, you'd know how very wrong the majority are. Of course, you're not going to do that; you'll probably just keep telling me how contrarian I am.


[Screenshot: ranking increases for a selection of keywords]


I spent several hours analyzing a friend's website not too long ago, and in the end, I made 6 total changes to his site to achieve these results for him - and yes, this is just a small selection.

> I'm not posting this to be a braggy shit - it's really not even a competitive SERP. My point here is: when was the last time you used keyword density, LSI, TF-IDF, or n-grams to find 6 very specific changes you could make that actually resulted in these kinds of gains? If anyone else was doing this kind of stuff, you'd hear about it within the 0.00001 seconds it takes someone to post a 'rank boner'.

I've been told my views are contrarian for years, but somehow I just keep on ranking... Guess I'm just completely wrong and I should start caring about the latest hype-terms and trust people to understand what they actually mean, versus using my own brain.

I'll just throw a link out there from SearchEngineJournal, which is probably the only sensible article about LSI.
 
> when was the last time you used keyword density, LSI, TF-IDF, or n-grams to find 6 very specific changes you could make that actually resulted in these kinds of gains?

I've used LSI to lower my frequency, which in turn lowered my density, dozens of times to increase rank in competitive SERPs. You're saying some things don't work that do, and that's based upon my experience, not what I've read on SEO blogs.

You've made some great points but your way of doing SEO isn't necessarily the best way any more than mine is. Technical SEO is quite complicated and every site is different.

SEO is very forgiving in that if you do a few things really really well you can do just an ok job at others and still succeed. I've been following this thread with an open mind to see what you're doing that I can do better. You may find it beneficial to listen to @matora97, especially when he talks about using TFIDF when you're doing the initial research for a piece of content.

This was a great read btw: https://www.briggsby.com/on-page-seo-for-nlp

Finding things like this is exactly why I read this with an open mind. Much of it was too technical for me, but I kept the quote below from the article in mind as I read it a second time, and that helped me focus on what's important.

"The key to well-optimized content for NLP is simple sentence structure, especially when answering questions. The advice we give our clients is to think about the 1-2 sentence answer you’d expect Google Assistant to provide when asking it a question."
 
You say TF*IDF, LSI, n-gram matching, etc., don't work, but then recommend that Briggsby article that says they work in the very first paragraph:
> Traditional on-page SEO guidance is to target a primary phrase, its near-related terms, and its longtail variants by using them in the text and placing them in strategic locations on the page (i.e., title, headings, early in content, throughout content). However, writing for Natural Language Processing, or NLP, requires some additional steps and considerations.

It's not going into any depth because it's the introductory paragraph, but it clearly says you can't have NLP without the things you call bullshit. The key here is "requires some additional steps."

You said somewhere in the thread that density is out and frequency is in. I agree entirely and have been preaching that for years now. I'm pretty sure it's in the On-Page day in the Crash Course to some degree. I'd like to add some details about density here. Density is used by Google. It's used to see who is a spammer over-optimizing their content. But in the traditional sense of it being used as a ranking factor, it was usurped by TF*IDF. That is how they now measure what terms are important. The first part literally means "term frequency" which in one breath you say is bullshit but in the next breath say is how Google measures what an article is about. And then you tell the rest of us that we're the ones who don't understand it.

LSI, on the other hand, is how you keep your density down. If you know about a topic and write naturally, this will occur on its own. But there are ways to find out what "entities" Google considers related to a topic. Using them is how you clarify what the topic is and show Google that you're covering it in breadth and depth. It's most certainly not bullshit. NLP cannot exist without Latent Semantic Indexing, period. Google can't classify your content into one of these categories without LSI. It has to use the surrounding content to understand the context, otherwise when I type 'horse' they won't know if I'm talking about animals, woodworking, or illicit substances.

Saying N-Gram matching is bullshit is just silly. Every one of us has optimized for long-tail keywords. That can only happen if bigrams, trigrams, etc, are being parsed.

The problem with writing in a way that NLP can understand is we're right back to writing for robots. You literally suck all of the life, humanity, and art out of a sentence in order to make sure a robot can get it. That will have repercussions on your dwell time, bounce rate, pogo-sticking rate, conversions, etc. My assumption based on what I see happening in the SERPs is that Google's NLP powers are far past anything seen in the Briggsby article, which certainly covers the basics very well.

Try searching stuff like "that movie about the guy in the bunny suit" and watch what happens, and then tell me Google isn't using LSI, n-grams, TF*IDF, etc, enhanced by user metrics. There's no other way to explain it.
 
> It's not the same thing, yes frequency is part of density but that's like saying conversion rate is the same as visitors. It's not. They are two totally different metrics.

I think it's more like saying conversion rate is the same as tracking the number of people who completed a given task, and then saying conversion rate is a non-factor and you should move to counting the number of conversions. But at the end of the day it all has its place, and to say it's bullshit is wrong. I don't want it to seem like I was attacking you, as I do think you are very knowledgeable, but you are just wrong on this one.

> You've made some great points but your way of doing SEO isn't necessarily the best way any more than mine is. Technical SEO is quite complicated and every site is different.

You said it best; that's essentially what I was getting at. I'm not the best at explaining my position. It's easy to suspect some things and find yourself reading articles that support your beliefs.

> Try searching stuff like "that movie about the guy in the bunny suit" and watch what happens, and then tell me Google isn't using LSI, n-grams, TF*IDF, etc, enhanced by user metrics. There's no other way to explain it.

Couldn't have said it better.
 
@Calamari @Ryuzaki

Not discounting the fact you guys have had results with modifying your content, I'm just saying you're attributing the wrong term to what it is that you've actually done.

TF-IDF:

The way this is commonly preached, you can rank better by increasing your TF-IDF score - which can simply be inflated artificially by increasing your term frequency for the majority of keywords.

I personally have a big issue with this: stick the term in too many times and you're keyword stuffing. Keyword density, as you said Ryu, is often easy to correlate with ranking drops when it goes too high. Same thing with TF-IDF.

So artificially increasing the score - what are you actually doing here? All you personally have control over is frequency. Why add the extra steps? Same thing with density.

Another inherent issue with TF-IDF is that you need to know the exact IDF. We can all look up statistics, but those are changing daily.

TF-IDF is useful for Information Retrieval, but that doesn't mean Google use it - nobody on this forum can prove that. I personally think it's not very useful for SEOs. You just can't gauge term weight accurately with this alone; too many interdependent factors, imo.

So as with any metric if you can't count what's important, you make what you can count important.

I genuinely think you're wasting your time with things like TF-IDF because you can't possibly hope to use the score in a massively meaningful way. But it boils down to making use of what you can actually make use of. So maybe it's like TF-IDF-LITE.

Someone saying they have got good results from TF-IDF is either in the wrong business and should go talk to DuckDuckGo about how they can get as good as Google, or they're simply just optimizing frequency and giving it a score based on their own personal IDF estimate. Extra steps that aren't worthwhile imo if you can achieve the same result by just counting terms.
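To show why I call them extra steps, here's textbook TF-IDF as a sketch - a made-up toy corpus, and no claim that Google computes anything like this:

```python
import math
from collections import Counter

def tf_idf(term: str, doc: list[str], corpus: list[list[str]]) -> float:
    """Textbook TF-IDF: term frequency in your doc, weighted by how rare
    the term is across the whole corpus."""
    tf = Counter(doc)[term] / len(doc)
    df = sum(1 for d in corpus if term in d)  # document frequency
    idf = math.log(len(corpus) / (1 + df))    # +1 smoothing avoids division by zero
    return tf * idf

# Toy corpus standing in for an index only the search engine can actually see.
corpus = [
    "best gaming keyboard for kids".split(),
    "mechanical switches buying guide".split(),
    "best budget laptops this year".split(),
]
print(tf_idf("keyboard", corpus[0], corpus))  # ~0.081
```

Notice the only lever you hold is TF - repeat the term and the score climbs, i.e. keyword stuffing - while the IDF half depends on the index, which you can only estimate.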

LSI:

Google have used Latent Semantic Indexing in the past, and probably still do to some extent.

LSI is old as hell and I want to reference what @CCarter said about people thinking Google are sitting on their backside. There are many methods more advanced and accurate than LSI.

RankBrain was a big step away from LSI style indexing. Check this out. What this kind of approach to indexing does is simply beyond the scope of what LSI can do alone.

I think the way most people use LSI is a misnomer. Kidding ourselves by saying we can utilize LSI by adding synonyms is a massive over-simplification. What people attribute to LSI is far more akin to Phrase-Based Indexing, but even that doesn't quite cut it. There's a lot more going on underneath the hood with synonyms, and I'm not arguing about whether adding synonyms to your content is good. What I'm saying is that doing so isn't "LSI". Synonyms have their place in LSI just like they do in any kind of Information Retrieval system.
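For anyone curious what LSI actually is under the hood, here's a minimal sketch - SVD over a term-document matrix, not a synonym checklist. The docs are invented, and it assumes scikit-learn is available:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "the horse won the race at the track",
    "saddle and bridle care for your horse",
    "dovetail joints and hand planes in the workshop",
    "table saw and sawhorse workshop setup",
]

# LSI/LSA in two lines: build a term-document matrix, then use SVD to
# collapse it into a small number of latent 'concepts'.
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
concepts = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
print(concepts)  # docs on the same latent concept score similarly on a
                 # component, even when they share few literal words
```

That 'latent' part is the whole point, and it's a property of the index, not something you bolt onto an article by sprinkling synonyms.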

N-Grams:

N-grams - bigrams, trigrams, 4-grams, etc. - are more practical than the two areas above.

My issue is that I simply think it's a bit of a waste of time to make a big deal about them. Checking takes seconds and wouldn't be part of any serious optimization campaign I would personally run.

Of course, it can get results because, again, it's based on words, and as we've all agreed, adjusting content will have an impact on your rankings.
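Since it really does take seconds, here's a pure-stdlib sketch of pulling the most common bigrams and trigrams out of a draft (example text invented):

```python
from collections import Counter

def top_ngrams(text: str, n: int, k: int = 5) -> list[tuple[str, int]]:
    """Most frequent n-grams (bigrams, trigrams, ...) in a text."""
    words = text.lower().split()
    grams = zip(*(words[i:] for i in range(n)))  # sliding windows of length n
    return Counter(" ".join(g) for g in grams).most_common(k)

draft = "best gaming keyboard for kids and the best gaming keyboard under 50"
print(top_ngrams(draft, 2))  # bigrams:  ('best gaming', 2), ('gaming keyboard', 2), ...
print(top_ngrams(draft, 3))  # trigrams: ('best gaming keyboard', 2), ...
```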

TL;DR

I think the issue here is that people are over-simplifying things like TF-IDF, LSI and not realizing that the way this works is too complex and interdependent with other forms of IR & Indexing for any mortal to be able to really utilize.

I think that while people continue to attribute really simple things, like modifying content, to complex forms of indexing or information retrieval, they will continue to miss the key point.

While we continue to say that optimizing synonym usage is LSI, we completely miss that LSI (the true definition) has very likely been mostly or completely replaced.
 
> I don't want it to seem like I was attacking you, as I do think you are very knowledgeable, but you are just wrong on this one.

As far as I'm concerned it's all just a debate between professionals so no worries there.

If you guys can prove me wrong I welcome it. I know I'm stubborn, but I genuinely think that a lot of these buzzwords are just too complex to be able to truly create accurate, meaningful metrics for ourselves.

That and the fact that I'm 99% sure that people mislabel these terms. I've seen no evidence to the contrary, just people saying they've used *insert method here* to increase rankings.

I could create a term, talk to a few friends, and secure some guest posts. Get some industry buzz, and the next thing you know the entire village believes it's real because it was getting them results. Ultimately it would still boil down to the core concepts that we can personally manipulate.

We can't manipulate the way Google handles the data, so all we can do is try to create better content and stay away from over-complicating things with fancy words, extra steps and such.

That's basically where I'm coming from, and like I said, if you guys can disprove what I'm saying here based on more than personal experience, I'd love that, because I genuinely enjoy this stuff. I just get worried that this topic is going to descend into a thread of cognitive biases, heuristics, and all that.

I'm not here to screw people over, if I'm saying something I'm saying it for a reason. I wouldn't talk about the topic unless I was sure, because believe me there's a lot of other stuff I've never mentioned anywhere that I think is promising - but I can't prove it.

I've tested all this other shit as a just-in-case, and I always get better results when I focus on what I can actually count in a meaningful way. I've done a lot of writing elsewhere about link sculpting and utilizing the same base philosophy to get good results from that. In order to do that, it's always better to go back to the core fundamentals.
 
> I think the issue here is that people are over-simplifying things like TF-IDF, LSI and not realizing that the way this works is too complex and interdependent with other forms of IR & Indexing for any mortal to be able to really utilize.

That's exactly right. I simplify my process as much as possible. I'm telling you what's working for me in my process of creating a single page that ranks for hundreds or even thousands of terms according to Ahrefs data, and typically, the more phrases I rank for according to that indicator, the more organic traffic I get.

You think I'm over simplifying and I think you're over complicating.
 
> You think I'm over simplifying and I think you're over complicating.

I've been pretty consistent about recommending simplifying optimization processes down to adjusting term frequency. I'm not sure how that's over-complicating it.

I'm not commenting on your processes btw, just saying that I think people over-simplify the methods they're talking about.
 
Great thread. I have nothing else to add, but I agree with the majority of what has been said by RomesFall.

People spending way too much time trying to please Google.

Google spending even more time trying to show results that please people.

Stop focusing on word count, density, etc, and start focusing on providing value to your readers and Google will reward you with juicy rankings.
 