Tidbits From Google's Gary Illyes AMA on Reddit

Ryuzaki

Source: Gary Illyes Reddit AMA

It just ended. Went for 2 hours. He didn't get to all the questions, including some that we probably wanted answered most, but I want to share some stuff from it as I read it. (Halfway through, I'm back to point out how many times people caught him saying the exact opposite of things he'd said elsewhere, to which he'd reply "Well, that's missing context." :D)

Introduction
Gary introduces himself: been with Google for 8 years, works on Web Search including Googlebot, Caffeine, and other unnamed stuff. He's focused on Google Images & Video now.

Hreflang
Using hreflang offers no ranking benefit. They pull the best ranking page (say, an EN page), and then if they see an hreflang tag (pointing to an ES version) and see the searcher is a Spanish speaker, they may show the ES page if they determine that's better for the searcher. No ranking boost, just a way to connect the searcher to the better page for them. A later question asks whether Google supports "es-419" to target LATAM, and the answer is "no." That's Latin America (I had to Google it). I guess that's similar to American English vs British English.
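For anyone who hasn't set these up, hreflang annotations are just alternate link tags in the head. A minimal sketch with made-up example URLs; per Gary's answer, "es-419" wouldn't be recognized, so you'd target "es" generally or individual countries like "es-mx":

```html
<!-- Hypothetical URLs; hreflang only connects language/region variants, no boost -->
<link rel="alternate" hreflang="en" href="https://example.com/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<!-- Per the AMA, "es-419" (Latin America) isn't supported, so target countries individually: -->
<link rel="alternate" hreflang="es-mx" href="https://example.com/es-mx/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```

Each variant is supposed to list all the others (including itself), or the tags may be ignored.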

SERP User Interactions
Stuff like pogosticking, CTR, and dwell time, Gary says, aren't ranking signals but evaluation signals. This basically means these evaluation signals inform the algorithms. In my opinion that's basically the same thing, with evaluation signals perhaps not being live. I could think of it as a side-algorithm like Penguin used to be. He calls this "user interaction" stuff a conspiracy theory, "Fishkin's new theory," and mentions he largely ignores Twitter because Guru stuff like this gets thrown at him as proof he's lying.

Brand Mentions
Not used as a ranking signal, but "Entity Determination" was given as an example of something brand mentions are used for.

Internal Link Over-Optimization Penalty
"No, you can abuse your internal links as much as you want AFAIK." Take that to mean what you will. We saw some problems arise from this on giant sites during the Panda era. Just like he says "as far as I know" I'll give myself the same kind of out: maybe we misinterpreted things then.

Author E-A-T
He kind of confirms this exists by saying that if Expert A links to Expert B, Expert B's expertise is algorithmically increased. He also says this is oversimplified but correct.

Cross-Site Canonicals
If your post gets syndicated to another site and that site lists your original as the canonical URL... It's the page that already shows in the SERPs that gets the benefit from the links. He doesn't say that it's the original that gets the benefit, but the one with the current exposure. He doesn't specify. That could be the original, but Google doesn't care about that as much as providing quality results. Kind of scary, the implications. He said he wouldn't say more because spammers would have a field day, which seems to confirm what I'm getting at here. Big sites can steal your content and get the benefit of any links you get.
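For context, the cross-domain canonical he's talking about is a single link tag the syndicating site places on its copy, pointing back at the source. A minimal sketch with hypothetical URLs:

```html
<!-- On the syndication partner's copy of the article (example URLs) -->
<link rel="canonical" href="https://original-site.com/original-post/" />
```

The unsettling part of his answer is that this tag is treated as a hint, not a directive, so consolidation can still favor whichever version already has the exposure.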

Spinning Content
If you use Google's NLP API and can produce content that's indistinguishable from human content, then "go for it."

Funnel Content & Increasing Authority on a Topic
If a site targets all aspects of the buyer's journey in the funnel, it is not seen as more authoritative than one that only goes after products or whatever. Seems like they focus more on pages than trying to profile the site's content by intent and then evaluating whether they cover the whole funnel. However, by having more content on a single topic, you do increase the authority of the site and other pages on the topic.

ccTLDs & Mismatched Location
Don't use a geo-targeted-by-default domain extension like .lk and then set it in Search Console to target the USA. You'd get a boost in Sri Lanka but start from a "penalized position" for US rankings. Duh.

Website Features
Google Web Search doesn't care about things like whether or not you have a phone number and address listed on your site, though stuff like Google Maps may use it. (This seems not true based on the rater's guidelines that he pushed people towards in the AMA. It's back to semantics. It "informs the algorithm but isn't used in the algorithm").

NoIndex / Follow Pages
John Mueller said something that caused Yoast to change their plugin at one point, which was that if you noindex but follow your category pages, Google will stop visiting them and stop following those links (turning them nofollow was the conclusion at the time). Gary Illyes says that Google wouldn't 404 these pages, they just don't index them, but Googlebot would still follow the links on the pages, as we'd expect. And if they're linked from other pages, like pagination or whatever, then you can expect them to continue to be crawled (but not indexed) and the links followed.
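For reference, the tag at the center of that Yoast change is the meta robots tag on category/archive pages. A minimal sketch of what the plugin used to output:

```html
<!-- "follow" is the default behavior, so "noindex" alone means the same thing -->
<meta name="robots" content="noindex, follow" />
```

Per Gary's answer here, pages carrying this tag stay out of the index but keep getting crawled (as long as something links to them), and their links keep getting followed.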

New Indexing API
Apparently Wix has access to this and they're working with Yoast to include it in their plugin but Joost over-spoke when he announced it. It may not be the same thing, and also Google isn't sure "pushing" content to this API is going to be as good as the way they currently "pull" pages to index through discovery and RSS feeds or whatever, I guess. Seems a ripe way to abuse Googlebot.

New Site Spam Filters
There are anti-spam mechanisms in place that he won't reveal (and there's no sandbox; this is "the sandbox"). Other people mention stuff like using keywords that trip filters, launching and getting a boatload of links (ones that aren't vetted by big authority links, like news spreading), etc. But that's all speculation.

External Review Sites & aggregateRating Schema
Some sites are pulling reviews from external sites and then using that schema to get Ratings Stars in the SERPs. You're supposed to use reviews from the page itself. Gary Illyes is aware and asked the manual actions team to take a look the other day. Some sites are going to have a bad time coming soon.
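For anyone unsure what's being abused here, this is the aggregateRating structured data that produces review stars. A minimal JSON-LD sketch with hypothetical values; the point of Gary's answer is that the rating is supposed to summarize reviews collected and shown on that page itself:

```html
<!-- Hypothetical product markup; the rating should reflect reviews on this page,
     not scores scraped from third-party review sites -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```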

QAPage Structured Data
Same as above. If users can't ask questions, submit answers, vote on them, etc., then don't use the QA schema. Like, don't use it on your FAQ page. He's sending this to the manual actions team.
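For comparison, here's a minimal QAPage sketch with hypothetical content, which only belongs on pages where users genuinely submit questions, post answers, and vote:

```html
<!-- Only valid where users can actually ask/answer/vote; not for static FAQ pages -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "How do I reset my widget?",
    "answerCount": 1,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Hold the reset button for ten seconds.",
      "upvoteCount": 12
    }
  }
}
</script>
```

A static FAQ page would use the separate FAQPage type instead, since the site authors wrote both the questions and the answers.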

Are Some Domains Unrepairable?
"I don't mean manual penalties, but for some reason the domain won't be able to rank given its history of sketchy content / behavior?" No.

Secret SEO Recipe
Does Gary know some super secret stuff that he'd never share that he could implement as an SEO that would make insane money? Yes.

Are 301's and 302's the Same Now?
Yes.

Folder Level Signals
Like sub-domains, if you have obvious sub-folder hierarchies like example.com/username/blog Google will eventually chunk that site into a lot of mini-sites that live under the main domain.

Old 301 Redirects
He's saying that 301's pass their metrics once and Google understands that and eventually, like 5 years down the road, you can drop the 301 and the receiving page won't lose the metrics. It's not clear if this is for external 301's or only internals. Personally, I'd leave them in place.

How Will Ranking Websites Change in the Future?
"i really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made up terms by the rank trackers, and that they talked more with the developers of the website once done with the first part of this sentence."

Can You Tell Us Any Unrevealed Ranking Signals?
"Pornyness"

Ghost Links / Link Echoes
Do old links that disappear still help you? Nope.

Link Velocity
This is made up and not a real thing, yet if he gave answers spammers would exploit it... so it's real but not real but Google doesn't lie.

Sentiment Analysis
Gary "doesn't know what that's about" and an extra question tagged on is answered that "nofollow links don't have value."

Content Pruning
It has a slight positive effect to prune content that gets no traffic. An earlier question asked about dropping 30,000 URLs, and he said you'll probably see a ranking drop due to loss of page rank (if the pages had backlinks), and it'll end up removing some internal links and further orphaning some pages, etc. So two different answers, possibly differing due to the scale and the site structure.

Merging Similar Pages
The only benefit you get is from consolidating page rank, but I'm saying to not forget that each page generates a tiny bit of its own page rank too.

Conversions on Page
"I assume Google is smart enough to understand conversions on our sites, does the percentage of conversions count as a ranking factor?" Nope. No, as in they don't use conversions as a ranking signal. My thought: Can they measure it? Maybe but not accurately enough to trust the data unless you feed it to them through Analytics, and not all sites do this.
 
Nice write-up!

SERP User Interactions
Stuff like pogosticking, CTR, and dwell time, Gary says, aren't ranking signals but evaluation signals.

He says tomato, we say tomahto.

Brand Mentions
Not used as a ranking signal, but "Entity Determination" was given as an example of something brand mentions are used for.

Entities being puny brand, small brand, medium brand, and BIG brand :wink:
 
"nofollow links don't have value."

I don't think that is accurate. I know that's always been their stated position on it, so I understand why that was his answer, but I feel like I have seen evidence to the contrary over the years.
 
Ghost Links / Link Echoes
Do old links that disappear still help you? Nope.

This is interesting, looking at other recent case studies I am wondering if this was a new thing from recent algo updates since we had older case studies like the MOZ one saying the exact opposite.

Folder Level Signals
Like sub-domains, if you have obvious sub-folder hierarchies like example.com/username/blog Google will eventually chunk that site into a lot of mini-sites that live under the main domain.

Damn, well I guess I have to start using folder hierarchies more for my authority sites.
 
Halfway through, I'm back to point out how many times people caught him saying the exact opposite of things he said elsewhere and he'd say "Well, that's missing context" :D

I have a difficult time believing that any single person at Google is capable of answering all or even most of the questions that this guy was asked. I think it's a complicated algorithm or, better yet, series of algorithms that, I would imagine, are beyond the understanding of a single person and are maintained by a number of people.

Moreover, I have a difficult time believing much of what Google representatives divulge via Tweets, AMAs and the like. Frankly, I think there's a blanket of ambiguity, hypocrisy and mistruths that the folks at Google both intentionally and probably unintentionally cast when providing tidbits of insight regarding Google search. It largely sounds like repetitive nonsense to me. Concepts like "EAT" and the like are, from my vantage, extremely unclear and also blown out of proportion and are the subject of immense extrapolation, to the point of absurdity. For instance, aren't expertise, authority and trust one and the same? Can't "EAT" be described as "credibility" or simply "authoritativeness" or something similar? Can't we describe these concepts more succinctly so that people can actually understand them and then apply them? Similarly, do Google's guidelines really need to be 100+ pages? From my perspective, the document doesn't offer 100+ pages of value, but rather could succinctly be summarized in a few short pages.

@Ryuzaki's post above is helpful and I don't mean to detract from it, but my irritation with Google's nonsense is growing exponentially. I am not suggesting Google ought to disseminate its playbook, but I could do without (what I perceive as) the unnecessary superfluous and lengthy nonsense Google continues to provide.
 
I have a difficult time believing that any single person at Google is capable of answering all or even most of the questions that this guy was asked. I think it's a complicated algorithm or, better yet, series of algorithms that, I would imagine, are beyond the understanding of a single person and are maintained by a number of people.

Moreover, I have a difficult time believing much of what Google representatives divulge via Tweets, AMAs and the like. Frankly, I think there's a blanket of ambiguity, hypocrisy and mistruths that the folks at Google both intentionally and probably unintentionally cast when providing tidbits of insight regarding Google search. It largely sounds like repetitive nonsense to me. Concepts like "EAT" and the like are, from my vantage, extremely unclear and also blown out of proportion and are the subject of immense extrapolation, to the point of absurdity. For instance, aren't expertise, authority and trust one and the same? Can't "EAT" be described as "credibility" or simply "authoritativeness" or something similar? Can't we describe these concepts more succinctly so that people can actually understand them and then apply them? Similarly, do Google's guidelines really need to be 100+ pages? From my perspective, the document doesn't offer 100+ pages of value, but rather could succinctly be summarized in a few short pages.

If you read between the lines, he's basically telling us most of what we need to know:

How Will Ranking Websites Change in the Future?
"i really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made up terms by the rank trackers, and that they talked more with the developers of the website once done with the first part of this sentence."

Go back to fundamentals...

After that, test, test, test.
 