What's your take on Google's Search Algorithm right now?

Joined
Feb 14, 2015
Messages
49
Likes
20
Degree
0
It seems like links mean very little these days and even hurt you in some cases. Anchor text seems to mean very little, age means little, and even domain authority is getting usurped by freshness. This could just be my vertical; I don't analyze others that hard, and I don't do a lot of Google searching otherwise in my free time.
Do you mean any new links pointed to a new page, or any new links to any page?

A site I do SEO for has been stuck at number 4 for a big money term, and there's one site above me that I just can't figure out. They have nowhere near the link profile, their on-page is not better, their design is not better... I've exhausted almost everything (that I can think of). And, recently, they just jumped one of the biggest in our industry to hit number 2!

So yea, it's getting crazy!
 

blackwar85

BuSo Pro
Joined
Feb 19, 2015
Messages
53
Likes
49
Degree
0
I think the most likely, common answer is that your competitor is using PBN links that you can't see with link checkers like Ahrefs or Majestic.

 
Joined
Mar 15, 2018
Messages
25
Likes
9
Degree
0
They had top 3 and are getting superior user metrics because they're in the top 3 and now they'll stay there for that reason.
I am sure the algorithm expects better metrics in higher positions and values them accordingly. That's why you see the shuffling, and I wouldn't be surprised if they show different SERPs to a small percentage of users on an ongoing basis to gather data before making big moves.

Like... if you wanted to build an authority site in X niche, you're better off building 10 of them and seeing which one Google magically prefers and then dropping the others and focusing on that one.
What I am seeing is that Google doesn't magically prefer one site over others. They seem to prefer "Average Article Y" on "Weak Page X" over 4 other articles from trusted authority sites. Meanwhile all the other articles from "Weak Page X" barely see any traffic at all.

Example:



Page has around 50 articles of equal length and quality. Most of them target "best product" keywords, with a handful of info posts.

One article is responsible for ~50.0K traffic. Google not only ranks it #1 for "best product", but also for just "product". There are 10 other pages going after this set of keywords; 8 of them have a stronger overall backlink profile, and 2 of them even have decent links pointing at their version.

There is NOTHING that makes this exact article special. It's average freelancer content with stock images, 4k+ words. This is beating pages that usually crush the SERPS. Again, this is not one money article the whole page is optimized for. They target 50 other keywords with similar volume in the exact same way.

But over the long haul this is a bad move because users will find an alternative altogether.
Will they though? In my personal experience Google is still the best search engine out there, by far. I tried switching to alternatives a couple of times, but I always give up and go back to Google.

I believe we are seeing Google pull back on these quantitative metrics as it grows to have a deeper understanding of content.
I think so too. However, I'd say long-form quality content still beats short-form quality content for the most part in today's SERPs, even if the key search intent is satisfied by the short version. The X% that want to learn all the unnecessary small details will stay longer instead of having to go back and look at the next result.

Long term, optimizing for metrics and numbers instead of user intent is betting against Google's ability to eventually understand and value the quality of content with 100% accuracy.
 

JasonSc

BuSo Pro
Joined
Mar 30, 2016
Messages
86
Likes
123
Degree
0
Never were we asked to cover a topic in as many words as necessary, to place an order for a well researched article that covers the topic in a concise, thorough manner, or to pay writers solely based on the quality of their research and final product, not the length.
I wish I could find a writer who would agree to work on a per-article basis and not a per-word basis. I hate saying it needs to be "x" words. This pigeonholes both the writer and me. If I think it needs to be 2,000 words and in reality it only needs to be 1,000 words, then I get back 1,000 words of good stuff and 1,000 words of fluff.
 

Potatoe

BuSo Pro
Joined
Jan 4, 2016
Messages
410
Likes
573
Degree
2
@JasonSc

There are some people who will work by the hour. That gives more of an incentive for the writer to spend time on research, if they aren't factoring research into their 'per word' rate. That might be worth looking into.

The problem with a flat rate per article, if you're disregarding length entirely, is that it leaves too much leeway for either the client to feel like they're getting shortchanged, or the writer to end up doing too much extra work that they're not being compensated for.

If someone's regular rate is 5 cents per word and you're looking for articles that are exactly as long as they need to be and you're paying $75 per article, it works against you if the article ends up being 500 words, and it works against the writer if the topic requires 3000 words.
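To make that tradeoff concrete, here's a tiny sketch using the post's numbers ($0.05/word, $75 flat); the helper names are made up, and amounts are kept in integer cents to avoid float rounding:

```python
# Hypothetical illustration of the flat-rate vs. per-word tradeoff.
PER_WORD_CENTS = 5      # writer's normal per-word rate ($0.05), in cents
FLAT_FEE_CENTS = 7500   # flat fee per article ($75), in cents

def per_word_equivalent_cents(word_count):
    """What the article would have cost at the per-word rate."""
    return word_count * PER_WORD_CENTS

def who_loses(word_count):
    """Which side a flat fee works against for a given article length."""
    equivalent = per_word_equivalent_cents(word_count)
    if equivalent < FLAT_FEE_CENTS:
        return "client overpays"    # short article: flat fee too high
    if equivalent > FLAT_FEE_CENTS:
        return "writer underpaid"   # long article: flat fee too low
    return "break even"
```

At 500 words the per-word equivalent is only $25, so the client overpays; at 3,000 words it's $150, so the writer eats the difference.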

You could start with a retainer, and a list of topics, and not specify how long the articles need to be. This is what @RomesFall and I do, and I think we're both very happy with it. This works especially well when you're working with someone on an on-going basis because there's no reason to try to inflate the word count with any filler, since there's always plenty more topics to cover.
 
Joined
Dec 31, 2016
Messages
384
Likes
233
Degree
1
This might be a "duh" situation, but I just realized that all the sites on the first page I am targeting have their main keyword first and isolated.

Like:

Widget - read more about Best Widget

Literally most of the time with a "-" after the keyword. Almost all of them. I was under the impression that kind of title tag was going out of favor. I had "Best Widget - test and review of Widgets", stuff like that.

I had a health niche site ranking top 5, beating out a lot of authority semi-gov health sites. Now it's gone completely. Who ranks 10th for that phrase now? A literal dictionary site! "Illness - read about the definition of Illness". Wow! That's something huh? I could understand Wikipedia of course, but a dictionary site for a medical ailment?

Ok, maybe these very limited data points are just that. Maybe there's that AI thing going on. Machine learning. Remember when the Amazon AI just refused to hire women? Or when another AI became racist against blacks? Pattern recognition is not foolproof and can give you correlation/causation dilemmas.

Maybe simple things like having "Keyword - " as the first thing in the title are such a thing. Maybe having "Keyword -" AND some kind of backlink authority is a PATTERN?

That's the thing with machine learning, isn't it? Women are not by definition worse employees for Amazon; it's just that perhaps a majority of the worse workers were women.

Having "Keyword -" and some kind of authority does not by definition make a site relevant; it's just that most relevant sites have "Keyword -" (meaning optimization by someone who cares) and authority.

2 cents from here; I have changed my title tags to see how they respond.
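If you want to check a SERP for that pattern yourself, here's a rough sketch; the regex, helper name, and sample titles are all hypothetical:

```python
import re

# Match titles that lead with an isolated keyword followed by " - ".
LEADING_KEYWORD = re.compile(r"^(?P<keyword>[^-]+?)\s*-\s+")

def leads_with_keyword(title, keyword):
    """True if the title starts with `keyword` followed by ' - '."""
    m = LEADING_KEYWORD.match(title)
    return bool(m) and m.group("keyword").strip().lower() == keyword.lower()

titles = [
    "Widget - read more about Best Widget",
    "Best Widget - test and review of Widgets",
    "The Ultimate Widget Guide",
]
matching = [t for t in titles if leads_with_keyword(t, "Widget")]
```

Run over the top 10 titles for a term, this counts how many competitors use the "Keyword - ..." format the post describes.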
 
Joined
Dec 31, 2016
Messages
384
Likes
233
Degree
1
@JasonSc
There's some people who will work by the hour. That gives more of an incentive for the writer to spend time on research, if they aren't factoring research into their 'per word' rate. That might be worth looking into.
I'm going to do a bit of content production on the side. "Rich content" is what I'll call it or something like that. Includes some keyword research, bringing in stuff from other sources, studies, images, vids etc.

I'm thinking about splitting the price into a per-word price AND a per-research-hour price.

I'll do this, because it gives incentive for both me and the buyer.

The problem with a purely per-word price is that it leads to fluff, since you need to be compensated enough to research, and you can only achieve that by increasing the per-word price or by writing fluff. Not a good scenario.

I'll have a fixed rate per word and then an hourly research rate. This also selects for more long-term, stable engagements, since the cost of research goes down as the word count goes up. You won't need as much research writing article number 20 on the topic.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,278
Likes
5,979
Degree
7
That's why you see the shuffling and I wouldn't be surprised that they show different SERPS to a small percentage of users on an ongoing basis to gather data before making big moves.
Absolutely. They split test constantly. There is no single static SERP. You can hit refresh and get the same results shuffled around. Then you think about "well, which data center shard did I connect to this time" and then localized and personalized results too.

Widget - read more about Best Widget
That dictionary site ranking makes me think Google is changing the intent of the search (according to user behavior I'd hope), where users don't want a huge explanation. They just need to know the very basic that a definition can provide.

As far as the title tags, I'd say having your key phrase earlier in the title tag is still stronger, but I'd say that's largely for the users and capturing their eyes as they quickly glance around. I think we can still get TONS of mileage out of writing very click-worthy title tags and getting higher CTRs in the SERPs. And as long as you deliver quality, so there's no pogo-sticking and no excessively high bounce rate with low time on page, you'll get a sustained boost (minus the shuffling in the quote above).
 

Calamari

BuSo Pro
Boot Camp
Joined
Oct 6, 2014
Messages
818
Likes
965
Degree
3
I think we can still get TONS of mileage out of writing very click-worthy title tags and getting higher CTRs in the SERPs. And as long as you deliver quality, so there's no pogo-sticking and no excessively high bounce rate with low time on page, you'll get a sustained boost
My experience confirms this. This is one of the best practices we should all be doing.
 
Joined
Nov 18, 2016
Messages
65
Likes
53
Degree
0
Absolutely. They split test constantly. There is no single static SERP. You can hit refresh and get the same results shuffled around. Then you think about "well, which data center shard did I connect to this time" and then localized and personalized results too.
I can confirm this.

I published an article about a month ago that was ranking in the 30s and 40s. Then on the 2nd of November, it suddenly went from the 30s straight to the top 3.



It stayed there for about a week before suddenly dropping back to the 40s.

Now I compared the SERPs (before and after the jump) and they were two COMPLETELY different sets of results.

Whatever was done that gave me the boost on the 2nd of Nov was completely reversed to the result from before then.

Something to note here:

I scored 3 links to that page between the 20th and 29th of October, so I initially thought the links gave me the boost. Or maybe the links worked but were suddenly held back (since we now suspect Google holds back link power)?
 

shiftymcnab

BuSo Pro
Joined
Aug 16, 2017
Messages
59
Likes
37
Degree
0
I recently spent some time digesting the new Google raters guidelines. Their quality scoring system is pretty scary and harsh on face value and could be misinterpreted by both rater and publisher. Very strict guidelines, but also very loose. Which is the worrying part.

Regarding 'author' authority, or having none: neither makes a difference, as long as whatever is there is credible and returns NO negative citations etc.

There was one paragraph repeated quite a bit that's also worrying (loose paraphrase, because I'm not going back to read it again): "if you (the rater) have a gut feeling it's suspect, low quality, partially copied or similar, rate it very low regardless of quality or EAT results." :confused:

They push raters heavily towards research of expertise, authoritativeness, trustworthiness. Which can be equally given ‘very high’ rating through either academic knowledge or life experience.

For a health product review, for example, real-life testing of said products, or years of testing, is as worthy as being a Dr. with a PhD in said subject. (According to the guidelines, ofc.)

On the flip side, if your author is a known authority and impeccable, but your website has tons of negative reviews or has been publicly trashed in some way, you're getting the lowest score possible for that rated page. That also counts the other way around too.

You could craft an about us page highlighting your lifelong real-life studies or testing etc. within your niche/vertical. If no other info can be found, they will use that to decide if your site does have high EAT. (As long as your site isn't trash and full of misleading, unreadable garbage or bloated pointless text.)

As @CCarter said, word count or bulky content, even if it's good in some aspects, isn't typically 'adding value'. G now wants more value, even if that means less content. Unique 'widget' site components and functions could set you apart from the rest. If presented well, ofc.

I’ve pretty much made a hash of reciting the above.. I’m sorry, I’m an idiot. :tongue:

If you’re chasing serps and/or looking to add value I’d read the revised google raters document.

Edit: I just read the Marie Haynes link in the newsstand section. Read that, then the revised Google raters doc. +1 for this post. https://www.mariehaynes.com/the-september-27-early-october-algorithm-update-was-likely-about-googles-ability-to-assess-trust/
 
Joined
May 18, 2017
Messages
11
Likes
20
Degree
0
I was hit hard by the so-called 'Medic' update. 3 of my money sites went under. What I did to recover:

- I did not pay attention to any of those morons saying it is EAT and GQRG related. (Set the right mentality.)
- Fixed all my technical issues (a wrong canonical or 301 can tank your rankings).
- Trimmed all thin/dead content.
- Started a link building campaign and built several new PBNs (the sites were mostly whitehat in the past 2 years, so I decided some juicy links were needed).
- Made sure the articles left are good enough for the search intent. Some of them were not aligning with w/e was now in the SERP, so they had to be rewritten.
- Added more media: Facebook, Reddit, Quora, and YouTube embeds in the articles. Made some interactive charts with JS to beautify my data.
- Aimed for deeper content than the competitors (e.g. having more FAQs, exploring synonyms, covering more ground supporting the main topic).
- Checked the averages of HTML structure and structured data, and made sure I'm not too far over w/e was ranking top 10.
- Emulated brand search (this is a tricky one; if not done right it can tank you, so be warned).
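As a small illustration of the technical-issues step (the wrong-canonical case), here's a minimal sketch; the class and function names are made up:

```python
from html.parser import HTMLParser

# Pull the rel="canonical" href out of a page's HTML and compare it
# to the URL the page actually lives at.
class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr = dict(attrs)
            if (attr.get("rel") or "").lower() == "canonical":
                self.canonical = attr.get("href")

def check_canonical(html, page_url):
    """Return (canonical_href, points_at_self) for one page."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical, finder.canonical == page_url

good = '<head><link rel="canonical" href="https://example.com/page/"></head>'
bad = '<head><link rel="canonical" href="https://example.com/other/"></head>'
```

Run this over a crawl export and any page whose canonical points elsewhere (or is missing) is a candidate for the kind of silent ranking damage the post describes.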

Took me about 2 months and a TON of work, but since Oct everything is going back up. Not yet at the levels it was at before Aug, but hopefully it will regain that by the end of the year.

Conclusion: SEO is getting harder with every fukin month lol. You almost need a 5-year practitioner degree to solve all that shit. And JohnMu was right for once when he said these are core updates. It's just too much shit to be named a single reason for an update.
 
Joined
Sep 3, 2015
Messages
292
Likes
112
Degree
1
Anecdotally, I sorted all my posts by length and views and deleted everything under 300 words that had fewer than XXX views in the last month. I have recovered somewhat from a very negative SERP trend that has been going on this entire year. Unsure if related. Touch wood.
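That pruning rule is easy to script. A hypothetical sketch (the views threshold is deliberately a stand-in, since the poster didn't share theirs):

```python
# Flag posts that are both thin (under 300 words) and unvisited.
MIN_WORDS = 300
MIN_VIEWS = 50  # stand-in for the poster's undisclosed "XXX views"

def posts_to_prune(posts):
    """posts: list of (slug, word_count, views_last_month) tuples."""
    return [slug for slug, words, views in posts
            if words < MIN_WORDS and views < MIN_VIEWS]

posts = [
    ("thin-post", 120, 4),
    ("short-but-popular", 250, 900),
    ("long-post", 2400, 30),
]
pruned = posts_to_prune(posts)  # only "thin-post" meets both criteria
```

Note that both conditions must hold: a short post that still pulls traffic survives the cut.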
 

Calamari

BuSo Pro
Boot Camp
Joined
Oct 6, 2014
Messages
818
Likes
965
Degree
3
Anecdotally, I sorted all my posts by length and views and deleted everything under 300 words that had fewer than XXX views in the last month. I have recovered somewhat from a very negative SERP trend that has been going on this entire year. Unsure if related. Touch wood.
Sounds to me like you improved the quality of your overall site with this move. Well played. With a few more quality improvements I bet you can make even larger gains.
 
Joined
Nov 18, 2016
Messages
65
Likes
53
Degree
0
Anecdotally, I sorted all my posts by length and views and deleted everything under 300 words that had fewer than XXX views in the last month. I have recovered somewhat from a very negative SERP trend that has been going on this entire year. Unsure if related. Touch wood.
Hey!

Was there any specific tool or software you used to filter the content?
 

Tao

Joined
Apr 29, 2017
Messages
142
Likes
119
Degree
1
I am doing something similar at the moment to try to reverse my Medic killing.
  1. Installed WP Word Count to show posts with a low word count that can be deleted or added to.
  2. Identified posts that have little or no traffic using Google Analytics and Google Search Console. Either noindexed them or merged and 301'ed into another relevant post.
  3. Took the trial of SiteBulb to investigate errors on the site. Working through the list of issues to fix (not too many actually).
  4. Will start to use Squoosh to reduce image sizes. Will have to FTP download all images, squoosh them and put them back (hopefully will be able to bulk this somehow - or get my son to do it for me :wink: ).
  5. Reduced the amount of "clickbait" type article titles. This is a medical site, so should not be too sensationalist.
  6. Added a section to the top of each article with my name as author (linking to the about page) and an updated date.
  7. I need to look at internal linking to ensure I am doing this in a logical way, without using exact-match keywords too much. I need to find a good way to visualise the existing links. I thought SiteBulb might help, but it is not quite as intuitive as I thought; I have tried Screaming Frog too, but quickly ran into the 500 URL limit on the trial. I am actually considering writing a plugin to do this for me: one that loops through each post's content (not the entire page) and extracts the internal links, with anchor text. I can then export to CSV and cut/filter how I see fit. From this data, I might also silo the site rather than it being a free-for-all.
So far, no improvements - but it is early days, and I am struggling with stress/anxiety and depression again, which does not help :smile:
I know I also need to focus on getting some backlinks too which I have just put off doing.
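The internal-link audit in point 7 could be prototyped outside WordPress, too. A rough sketch (the function names and CSV layout are made up):

```python
import csv
import io
from html.parser import HTMLParser
from urllib.parse import urlparse

# Extract internal links + anchor text from post HTML, then write
# them to CSV for filtering in a spreadsheet.
class LinkExtractor(HTMLParser):
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.links = []     # (href, anchor_text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            host = urlparse(href).netloc
            # Internal = same host, or a relative URL.
            if host in ("", self.site_host):
                self._href = href
                self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def internal_links_csv(posts, site_host):
    """posts: {slug: html}. Returns CSV text: post, href, anchor."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["post", "href", "anchor"])
    for slug, html in posts.items():
        extractor = LinkExtractor(site_host)
        extractor.feed(html)
        for href, anchor in extractor.links:
            writer.writerow([slug, href, anchor])
    return out.getvalue()
```

Feed it post content (not full rendered pages, so nav and footer links are excluded) and the resulting CSV shows every internal link with its anchor text, ready to filter for over-used exact-match anchors.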
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,278
Likes
5,979
Degree
7
@Tao, Instead of Squoosh, since you want to get image optimization done in bulk, I recommend hitting up Kraken.io and signing up for one of their cheap plans so you get an API key. Then they have a free Wordpress plugin that you can bulk "krak" every image in your media folder. This will save you a ton of time dealing with uploading and downloading in FTP.

Also, for anyone trying to get a word count of their posts, I took a function and improved it to not count HTML tags, and shared it here. That'll work (I still use it) and keeps you from dealing with a plugin. You can paste it in, use it, then comment it out or delete it when done.
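For anyone who'd rather do the same thing outside WordPress, here's a Python sketch of the idea (strip tags first, then count; this is an equivalent sketch, not the shared PHP function):

```python
import re

# Replace HTML tags with spaces so markup doesn't inflate the count,
# then count runs of word characters (so stray punctuation is ignored).
TAG_RE = re.compile(r"<[^>]+>")

def word_count(html):
    text = TAG_RE.sub(" ", html)
    return len(re.findall(r"[A-Za-z0-9']+", text))

sample = "<p>Hello <strong>world</strong>, this is a <em>test</em>.</p>"
```

A naive `len(html.split())` would count tags and attributes as words; stripping them first gives the honest length of the visible copy.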
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,278
Likes
5,979
Degree
7
I hope you guys are ready for ANOTHER update... maybe this one will fix some of the madness:

SERPWoo


Moz


SEMRush
 

mj22

Back War Mongering.
Joined
Jun 20, 2018
Messages
131
Likes
62
Degree
0
I think shit will start settling/rolling out soon. I've been watching for the last month or so, from the outside looking in. I feel like some of the drops are due to an anchor/site over-optimization penalty; other times I'm thinking some kind of Panda penalty hit, and the rest is some kind of tomfoolery. I'm watching my own sites and other markets. I'd let Google get done fucking around in their own playground and see what shakes loose. Not sure what else to do. Nothing has been correlating as to wtf is going on yet. Ride it out.

I went from pos 8 to pos 16 for a 180k/mo SV term; then on a different site I went from pos 14 to pos 30 for an allergies KW, and all I did was run a few links with naked URLs. So who knows wtf is up, and the links are squeaky clean and white hat. So...
Actually, the one site I didn't do jack shit to took a hit. The others I didn't do shit to either, and they jumped up for mid-level terms.

I'll report back anything I find, I'm still in observation mode.

My best performing site is a PMD with direct and partial match anchors: under 8% partial and around 1% exact; the rest is mostly naked URL variations and generic anchors, due to the URL being a PMD. I'm talking anchor distribution.
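An anchor distribution like that can be tallied from a backlink export. A sketch with simplified, made-up classification rules (real tools use fuzzier matching):

```python
from collections import Counter

# Bucket backlink anchors into exact / naked URL / partial / generic
# and return each bucket's share as a percentage.
def anchor_distribution(anchors, exact, brand):
    buckets = Counter()
    for anchor in anchors:
        a = anchor.lower()
        if a == exact.lower():
            buckets["exact"] += 1
        elif a.startswith(("http://", "https://", "www.")):
            buckets["naked"] += 1
        elif exact.lower() in a or brand.lower() in a:
            buckets["partial"] += 1
        else:
            buckets["generic"] += 1
    total = len(anchors)
    return {k: round(100 * v / total, 1) for k, v in buckets.items()}

anchors = ["best widgets", "https://widgetsite.com", "click here",
           "widgetsite review", "www.widgetsite.com", "more info",
           "widgetsite", "this site", "read this", "homepage"]
dist = anchor_distribution(anchors, exact="best widgets", brand="widgetsite")
```

Comparing your own percentages against ranking competitors is the usual way to spot over-optimized exact-match ratios.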
 

Prentzz

BuSo Pro
Joined
Mar 19, 2015
Messages
242
Likes
340
Degree
1
Note: This reads a little crazy; you can either buy it or not. I'm not going to out my own sites or niches to prove any of this. Some might be wrong, I've been wrong before, but it's what I've seen from diving deep in a variety of niches over the past 6 months. I'll also mention that I think Majestic is now superior to Ahrefs, because the algo is so heavily favoring relevancy and TF on Majestic is still the best metric for this (critical for building networks). If I could only have one, I'd buy Majestic, although Ahrefs is still great.

I've been in the trenches hard the past couple months. Here's what I'll say:

The algorithm is absolutely terrible. It's difficult for me to find the actual information that I want; G almost always guesses incorrectly instead of using the query itself, and relies way too much on inferred intent.

I'm 99% sure that nobody at G knows what the algorithm is now and how it's changing. Split tests every week, complete opposites of each other, all machine learning now. The algo is out of whack and they have to at least know that, but they are confident that it's going to improve gradually over time.

I know this sounds crazy and people will be like 'how can you know this' etc., but if you watch the SERPs enough you see patterns that don't look human. It seems obvious to me that it's getting to a complete reliance on machine learning, which is why I think the G representatives CAN'T give accurate answers about updates, even if they want to.

I don't think it's just that Cutts is gone and these guys are keeping quiet; I think it's genuinely out of their control now and the algo is too complex. It's not like they can really pinpoint anything changing now.

Also, all this EAT stuff is complete bullshit. G would NEVER rely on something that's so insanely easy to manipulate. Just think about this seriously, guys: PhDs guest post all the time. You seriously think G can tell that your site wasn't written by a PhD if you include his name and credentials? Be serious, please. EAT is all links. You can see this by looking at true, genuine authority sites in the medical space which have had 'EAT' on their sites forever and yet still saw -10 or -20 on the update. The algo has always relied heavily on links for tell-tale signs of authority and reputation; why would this change to scanning pages for freakin' names and degree titles when links are infinitely harder to manipulate and far more reliable?

I know people say they are using raters to get data for their machine learning, which might be the case, but this doesn't prove that it's then looking for similarities on-page. It could just as easily be looking for similarities in link profiles to optimize the algo for figuring out 'trust' and 'authority' through links, which, again, are a far more reliable source of data than words on a page.

Note: I don't believe that including 'EAT' on your pages is a waste of time; it's just that I can nearly guarantee it won't have an impact on your rankings. Anyone with data to the contrary, please link me.

Relevance in links is becoming EVEN MORE important, particularly for breaking out of the sandbox. I have seen sites go from domain registration to page 1 in less than two months, sometimes weeks, with very few links (relative to the niche) but 100% relevant, baller links.

Fred is a relevancy and PBN filter on top of Penguin; if you can't see that by now, I dunno, guys. Fred only focuses on links, particularly on the anchor text you are using. But while Penguin was more about the percentage of individual terms, Fred is more complex and seems to use the 'authority' and 'relevancy' of the linking page to determine how much, if any, of the link juice should be passed. You can test this easily by linking from multiple pages to one page with random anchors and then searching G for them at one week, two weeks, one month, etc. Fred sandboxes links it doesn't know whether to trust; sometimes they never come out of the sandbox, sometimes only a percentage of the juice gets through. This is why sometimes you get a link and you're 100% sure you'll rank with it, because it's an easy term and the link is baller, and then nothing happens, or only a tiny move. The link juice was throttled or sandboxed for some reason. This is the 2.0 of the G patent that talks about rank transition, with rankings going up or down randomly to trick you.

PBNs are still alive and kicking, but generic general PBNs are dead and gone; you won't rank anything competitive with these alone. On the other hand, relevant networks are still kicking butt, but the new algo is doing something funky with changes in content and the 'topics' of sites. Can't share all the secretz guyzz; look in spam niches and you'll see what I mean. They are exclusively targeting PBNs with the algo now, nothing manual, which means they need to look for obvious signs that a domain is now a PBN.

The algo is also cracking down on pages that are irrelevant to the 'topic' of the domain. No more ranking for breast augmentation on your legal affiliate site. I've had multiple PAGES (not domains) go -100 overnight in the last month, for keywords they had ranked on page one for over many months and years.

Back to the trenches, never to return. SEO is frying my brain, maybe this is all garbage. Maybe not.
 

adamandeve

BuSo Pro
Joined
Jun 10, 2017
Messages
16
Likes
8
Degree
0
Here is an example of a YMYL website in the Health niche (I'm not going to out it) but it has a homepage and posts only.

There is:
No Contact Page
No Privacy Policy
No About Page
No Affiliate Disclaimer
No Terms of Service.

ZERO E.A.T on the website itself.

Current Ahrefs stats as of right now:

Organic keywords: 39.2K (+1K)
Organic traffic: 191K (+4K)
Traffic value: $73.8K

My opinion aligns with Prentzz's: I agree there is a time and a place for EAT, and going forward it is probably something to add to your checklist when building a website.

Having said that, a simple dig through the SERPs will show you that many of the websites that are ranking well have bugger all EAT but plenty of links, in particular high-PA links.

I think that relevant links, particularly high-PA relevant links, are working well, and I am also seeing websites ranking with high-PA links that aren't relevant at all, yet those websites are still killing it.

In the 5 years or so I've been in this industry, I can't recall a time when I have seen so many new websites popping up in the SERPs. You can look up a keyword, then do it again a week later, and there's a website you have never seen before ranking in the top 5.

Crazy Times
 
Joined
Nov 8, 2018
Messages
21
Likes
15
Degree
0
I am starting to wonder if the August update had **** all to do with the quality of sites. Maybe the update was only made to confuse and cause a storm in the SEO community. They demoted websites slightly randomly (if something can be slightly random).