What's your take on Google's Search Algorithm right now?

Here is an example of a YMYL website in the Health niche (I'm not going to out it) but it has a homepage and posts only.

There is:
No Contact Page
No Privacy Policy
No About Page
No Affiliate Disclaimer
No Terms of Service.

ZERO E.A.T on the website itself.

Current Ahrefs stats:

Organic keywords
39.2K +1K

Organic traffic
191K +4K

Traffic value
$73.8K

My opinion aligns with Prentzz: I agree there is a time and a place for E-A-T, and going forward it is probably something to add to your checklist when building a website.

Having said that, a simple dig through the SERPs will show you that many of the websites ranking well have bugger all E-A-T but plenty of links, in particular high-PA links.

I think relevant links, particularly high-PA relevant links, are working well, and I am also seeing websites ranking with high-PA links that aren't relevant at all, yet those sites are still killing it.

In 5 years or so of being in this industry, I can't recall a time when I have seen so many new websites popping up in the SERPs. You can look up a keyword, do it again a week later, and there is a website you have never seen before ranking in the top 5.

Crazy Times
Is it a site built on auction domains? I see many popping up overnight.
 

blackwar85

BuSo Pro
I can add that I have lately noticed very frequent fluctuations for keywords ranking in spots 1-3: jumping to the 5th-7th spot one day and then back to 1-3 the next. Seems like a lot of shuffling happening.

I'm using Proranktracker.
 

adamandeve

BuSo Pro
zentrader99 The domain was first registered at the end of May 2018 according to whois.com; however, if you put it into the Wayback Machine, the domain goes back to 2009, when it was a forum in the niche.

blackwar85 I have also noticed a lot of fluctuation in the SERPs; it seems like the norm now, since it happens so often.
 
Maybe the update was only made to confuse and cause a storm in the SEO community.
I think this is a BIG part of recent changes, and it is supported by a general lack of industry consensus. Previously, when specific changes were made to combat specific types of effective algo manipulation, they were very easy to reverse engineer and remedy. Cats move slow, mice move fast.

However, with enough measured randomization (I do believe something can be slightly random, though I also failed statistics in uni), the mouse doesn't know how to react. He is confused and running in circles, thus becoming easy prey for the cat.
 

eliquid

SERPWoo
Digital Strategist
I've said for a long time that randomness is how SEO will one day fail.

No one can say which SERP ranking is the correct one, except Google.

Therefore, if they were to randomly shuffle the SERPs, hardly anyone would question whether those rankings are right or not, and all the SEOs would have to throw out any notion of what causes rankings to go up or down.

It would also ( if done slightly differently ) almost completely destroy the SEO tools market as well and cause many of them to go bankrupt.

 

becool

BuSo Pro
I've said for a long time that randomness is how SEO will one day fail.

No one can say which SERP ranking is the correct one, except Google.

Therefore, if they were to randomly shuffle the SERPs, hardly anyone would question whether those rankings are right or not, and all the SEOs would have to throw out any notion of what causes rankings to go up or down.

It would also ( if done slightly differently ) almost completely destroy the SEO tools market as well and cause many of them to go bankrupt.

This is interesting. I’m thinking out loud here. The benefit (from Google’s vantage) of randomizing the SERPs is clear in that SEOs will lose (partially) their ability to game the system by manipulating the SERPs. However, how do you randomize the SERPs without completely sabotaging the user experience and the quality of the SERPs?

As an aside, I don’t think Google will ever willingly sabotage an entire industry, namely, the SEO industry. SEO and the guys who do SEO (i.e. you) are essential to Google’s algorithms and the improvement of its algorithms. They/you funnel money to AdWords and AdSense effectively. SEOs, whether they want to or not, provide Google with a layer of quality assurance and feedback. For instance, black hat SEOs popularized spam tactics to such astronomical levels that Google took notice and then spent years tweaking its algorithms to blockade those spammy efforts. (See WickedFire.) The significant rollouts/updates to the algorithms in the recent past (e.g. Panda, Fred, etc.) were largely a response to weaknesses in the algorithms which you detected and exploited. You/SEOs found holes with respect to keyword density, link building at scale and so on that Google didn’t know about, and you therefore made the algorithm updates possible. I don’t think users provide enough feedback regarding search. I think Google needs SEO and SEOs because, by constantly trying to game the system, you are Google’s best source of testing, quality assurance and feedback, as well as revenue, amongst other things.

(I don’t mean the references to “you” to be accusatory. I hope that’s clear.)
 

eliquid

SERPWoo
Digital Strategist
This is interesting. I’m thinking out loud here. The benefit (from Google’s vantage) of randomizing the SERPs is clear in that SEOs will lose (partially) their ability to game the system by manipulating the SERPs. However, how do you randomize the SERPs without completely sabotaging the user experience and the quality of the SERPs?
Think of it like this.

You have a bucket of websites that rank for a keyword. That keyword can be "blue widgets".

In this bucket are 4,000,000 URLs/sites that could rank for blue widgets.

But only 30 of those 4,000,000 are high-quality, authoritative, or worthy of ranking in the top 30 for that blue widget keyword. The rest of the 4,000,000 could be anywhere past the top 30 (positions 31-4,000,000).

Now, you are Google and want to disrupt SEO and rank trackers.

All you gotta do is randomly jumble up the top 30. It doesn't have to be a massive jumble where the site that was 30th yesterday is 1st today.

It doesn't have to stay jumbled every day, with impacts lasting 20 days straight.

It could just be slight movements every 3rd day, lasting only 2 days. OK, now replace "3rd day" with a random X and "2 days" with a random Y...

Let's say whoever was 1st is now 5th and whoever was 8th is now 1st, and that lasts for 2 days (or Y random days). 3 days (or X random days) later, another movement happens, where the site that was 12th is now 4th and the site that was 2nd is now 15th.

It's pretty easy to do. Everyone else is jumbling all day every day anyway (positions 31 to 4,000,000).

Do this enough times and the SEOs won't know why things are ranking the way they do. Every 90 days, do a major shift, call it a known algo change, and label it with a cute animal name to confuse SEO peeps even more. Now they blame the cute animal for everything that's moving randomly, and not the fact that you are just doing random things to piss them off and throw them off their game. They go down a whole other rabbit hole, running around in circles.

In this way you are not sabotaging user experience at all. Someone who is 5th could probably also be 1st (and probably was at one time anyway). Someone who is 12th could probably have been 5th. We aren't taking a crappy spun-content site from the 34th page and putting it on the 1st page.

We are just randomly jumbling up the already-good authority sites that could be anywhere on the first page anyway.
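The scheme above is easy to sketch in a few lines of Python — purely a hypothetical illustration of the idea, not anything Google is known to run. The `jumble_top30` helper, the site names, and the swap count are all invented for the example; the only constraint taken from the post is that the shuffle stays inside the top 30 while everything below position 30 is left untouched:

```python
import random

def jumble_top30(ranking, swap_count=2, rng=random):
    """Randomly swap a few positions within the top 30 only,
    leaving every position from 31 down completely untouched."""
    top, rest = ranking[:30], ranking[30:]
    for _ in range(swap_count):
        i, j = rng.sample(range(len(top)), 2)  # two distinct slots in the top 30
        top[i], top[j] = top[j], top[i]
    return top + rest

# Hypothetical 40-site SERP: site_1 currently ranks 1st, site_2 ranks 2nd, etc.
serp = [f"site_{n}" for n in range(1, 41)]
shuffled = jumble_top30(list(serp))

# Only the top 30 ever move; positions 31-40 keep their exact order,
# and no site enters or leaves the top 30 — it's a reshuffle, not a purge.
assert shuffled[30:] == serp[30:]
assert sorted(shuffled[:30]) == sorted(serp[:30])
```

Run on a schedule (every X random days, reverted after Y random days, per the post), the trackers would record churn that no on-page or link change explains.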

As an aside, I don’t think Google will ever willingly sabotage an entire industry, namely, the SEO industry. SEO and the guys who do SEO (i.e. you) are essential to Google’s algorithms and the improvement of its algorithms. They/you funnel money to AdWords and AdSense effectively. SEOs, whether they want to or not, provide Google with a layer of quality assurance and feedback. For instance, black hat SEOs popularized spam tactics to such astronomical levels that Google took notice and then spent years tweaking its algorithms to blockade those spammy efforts. (See WickedFire.) The significant rollouts/updates to the algorithms in the recent past (e.g. Panda, Fred, etc.) were largely a response to weaknesses in the algorithms which you detected and exploited. You/SEOs found holes with respect to keyword density, link building at scale and so on that Google didn’t know about, and you therefore made the algorithm updates possible. I don’t think users provide enough feedback regarding search. I think Google needs SEO and SEOs because, by constantly trying to game the system, you are Google’s best source of testing, quality assurance and feedback, as well as revenue, amongst other things.

(I don’t mean the references to “you” to be accusatory. I hope that’s clear.)
I understand your position very well, but I would have to disagree.

Google never needed SEOs for this. Maybe at an elementary level or as a filler until other tech evolved, but they didn't need us for this.

If none of us were finding loopholes and abusing them, their results would have been perfect for normal users of their app. They only fixed issues, because our end result was making their results crappy for end users. Without us, there would have been no crappy sites to filter out.

So the end result was not us making their system better for their end users because it was bad, the end result was them fixing a "virus" that we created to pollute their SERPs and they needed to clean it up.

Without us, there would have been no crappy sites that Google would have had to clean up, meaning their results would have been fine before we dirtied it up.

Plus, given enough time, the tech would have evolved to bypass it anyway: AI and machine learning, bad sites submitted through Webmaster Console, page speed and Google Analytics tie-ins for bounce rates, HTTPS, common-sense things like privacy policy and terms of service pages on the domain, etc.

All we did was create more work for them to clean up. Them making their systems "better" wasn't because they were broken, but because we dirtied them up and someone had to go and clean them up.

That's how I see it and lived it.

But everyone has their own theory, mine could be completely wrong.
 