[TEST] .info vs .review vs .me

Just a quick case study to start off.

This was an observation made while testing something else, and I thought I would share it and open it up for discussion.

The theory is that TLDs do matter when it comes to ranking.

Everyone knows you should stick to .com, .co.uk and the like, but is that really true?

This came about because of the budget I'm putting towards these types of single-variable tests (basically, keeping everything the same except one variable, which in this case is the TLD).

You can pick up .info and the like for under $1. So does it matter if you test with these rather than the $10 .coms?

Setup

3 sites - All registered at the same time

1 .info
1 .review
1 .me

Domain names are random words that have no listings in Google.

  • Each got its own hosting
  • Same html template
  • Title = Domain name
  • H1 = Domain name
  • 500 words of lorem ipsum

Given 3 days to settle, then all submitted to Google through the URL submitter.

A day later, all were indexed.
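For anyone wanting to replicate the setup, a minimal sketch of the kind of page generation this implies (the domains here are hypothetical placeholders, not the actual test domains):

```python
# Hypothetical sketch of the page setup described above, not the exact files
# used: title and H1 set to the domain name, body is ~500 words of lorem
# ipsum, identical HTML across all three sites.
TEMPLATE = """<!DOCTYPE html>
<html>
<head><title>{domain}</title></head>
<body>
<h1>{domain}</h1>
<p>{body}</p>
</body>
</html>
"""

LOREM = ("lorem ipsum dolor sit amet " * 100).strip()  # ~500 words of filler

# Placeholder domains; the real ones were random unlisted words.
for domain in ["randomworda.info", "randomwordb.review", "randomwordc.me"]:
    with open(domain + ".html", "w") as f:
        f.write(TEMPLATE.format(domain=domain, body=LOREM))
```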

---

The actual test

Picked a new random word that has no results (for example, rjklhosrrtq).

3 pages on each site:
  • url = example.com/rjklhosrrtq-gfhurh (the second word also has no matches)
  • title = rjklhosrrtq
  • h1 = rjklhosrrtq
  • 6th word in the text = rjklhosrrtq

Altogether 9 pages.

Left to settle, then submitted to Google.

11 days later: 6 pages indexed.

The other 3 pages were resubmitted.
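To log the listings over time, something like this rough sketch could work. Caveat: scraping Google violates their ToS and gets blocked quickly, so at this scale checking by hand is just as sensible; the domains are placeholders:

```python
# Rough position logger for the nonsense keyword. Records the relative order
# in which the test domains appear in the SERP, appended to a CSV per day.
import csv
import datetime

import requests

DOMAINS = ["randomworda.info", "randomwordb.review", "randomwordc.me"]
KEYWORD = "rjklhosrrtq"

def serp_order(keyword: str) -> list:
    html = requests.get(
        "https://www.google.com/search",
        params={"q": keyword},
        headers={"User-Agent": "Mozilla/5.0"},  # bare requests get rejected
        timeout=10,
    ).text
    # Order domains by where they first appear in the raw HTML -- crude, but
    # enough to record relative ordering day by day for a scratch test.
    found = [(html.find(d), d) for d in DOMAINS if d in html]
    return [d for _, d in sorted(found)]

with open("serp_log.csv", "a", newline="") as f:
    csv.writer(f).writerow([datetime.date.today().isoformat()] + serp_order(KEYWORD))
```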

----

Current listings

  1. .info
  2. .info
  3. .info
  4. .review
  5. .review
  6. .review

Needs a 3rd or 4th domain in there to really see. Obviously some .com / .co.uk / .com.au / .org domains.

Like most things, this needs more tests, more sites, and then repeating.

Your thoughts on these types of single-variable tests?
 
Be interesting to see the results of this. Don't think there'd be much difference between .me, .review and .info with all the sites built up targeting a specific keyword.

Think quality content + links would be the deciding factor as to which ranks higher.
 
You're going about this all wrong... I used to do this, to the tune of keeping hundreds of millions of URLs in the index. Choose a TLD that will resonate with your peoples, and build something of value.
 
You're going about this all wrong... I used to do this, to the tune of keeping hundreds of millions of URLs in the index. Choose a TLD that will resonate with your peoples, and build something of value.
This is not really about building out assets. It is about testing to see what really is a ranking factor and what is hearsay.

What did you learn from keeping millions of URLs in the index?
 
What did you learn from keeping millions of URLs in the index?

Use churn-and-burn money to develop a long-term asset. I didn't like the idea of chasing my tail for the rest of my life (i.e. worrying about which TLD does what).

Pick one. Stick to it. Build something for people. Sell it.
 
This is not really about building out assets. It is about testing to see what really is a ranking factor and what is hearsay.

I don't get it - the SERPs aren't in a vacuum. Let's say there is a small boost for .info to the tune of 0.00001 over .me - none of that will matter when you get into SERPs against aged domains and powerful authority. That 0.00001 advantage will mean nothing when I can throw the word "rjklhosrrtq" on a random inner page of one of my sites that has been in the SERPs for years and has a solid backlinking strategy, and outrank all of your sites within 24 hours.

This experiment is the equivalent of picking a green or red shirt to go pick up girls at a bar. Realistically the color of the shirt won't matter inside a dimly-lit setting such as a nightclub or bar, when you are going up against guys that are quicker or smarter or simply less socially awkward. The green versus red shirt won't save you.

Basically this is a "No Money" project. No serious money will result from this, nor any serious conclusion that leads to results that matter once taken out of the vacuum.

Really think about it, OP: are you going to waste weeks, months, or years of your life doing these small individual tests that result in no money in the long run, when you could have created a long-term asset NOW and started generating real revenue?

Let's say this experiment takes you 20 hours - you'll make zero money and drive zero traffic. Meanwhile, you could have spent those 20 hours learning techniques like traffic leaking, doing a ton of real experiments on humans that drive in direct traffic, and figuring out how to grow a traffic leak from 100 to 500 to 1,000 to 10,000 and then to 100,000 visitors like @built did over the course of 6 months. That's a skill you can use to generate serious traffic and serious revenue when applied right. Learning the small ranking boost of .info versus .me seems like a waste of your time for, again, No Money.

Your time on this earth is limited; you might as well spend it on things that will make a lasting impact on your bottom line and let you live the life you want.

You'll be more fulfilled by doing experiments on making money. @built went through the traffic leaks and generated up to 100,000 in a single day (https://www.buildersociety.com/threads/right-lets-try-this-again-6-months-to-a-milli-2-0.1025/), then he switched themes and worked his way up till he sold $1.6K in products a day and then more (https://www.buildersociety.com/threads/cc9-donna.554/page-9).

Those are the types of experiments that create real value for yourself and anyone that would hire you for work. I could name 100 people that would hire @built for one-time or long-term projects simply because of those two skills he acquired. I can't think of anyone who would hire for the reverse.
 
Thanks for the feedback, CCarter - honored that you would drop your thoughts into this thread.

Just to say I do build out websites. This is a side project and I believe you should always be testing things.

OK, there is no money to be made in this test.

But there is money to be made here in multiple ways.

For example: you could sell these tests as reports. What really works in SEO?

Secondly, I will admit this is not the greatest theory to be testing, as no one builds sites like these; give me a brandable aged domain with a healthy backlink profile.

Even better, let's go out and find an already aged, active domain that is being looked after and offer cash for it.

Also, if you could prove that something like H1 tags are a major ranking factor, and so are H4 tags, while H2 and H3 don't do shit - would that not be helpful to your on-page SEO?

This was just an observational test that I thought I could turn into a good first post here, as I wanted to give a little something back. The main test is actually a clickbot test, seeing whether SERP click-through rate can be manipulated. Is that not helpful to trying to rank? If it affects the ranking in a positive or a negative way, that would be something you could use in real life. The only bad result would be no result at all.

Should I try this in the wild? I think you need to control the variables. And not mess with other people's work.

I know it works with human clicks, especially to video. So will it work with emulated humans?

Have to test and see.
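For the curious, a toy sketch of what such an emulated-click test might look like, assuming Selenium with chromedriver; the keyword and domain are the placeholders from above, and a real run would need proxies, fingerprint variation, and randomized behavior or Google will discount the clicks:

```python
# Toy sketch of one emulated SERP click, just to show the shape of the test.
# Requires selenium and a chromedriver on PATH.
import random
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

def click_and_dwell(keyword: str, target_domain: str) -> None:
    driver = webdriver.Chrome()
    try:
        driver.get("https://www.google.com/search?q=" + keyword)
        # Find the first result link pointing at the target domain.
        for link in driver.find_elements(By.CSS_SELECTOR, "a"):
            href = link.get_attribute("href") or ""
            if target_domain in href:
                link.click()
                time.sleep(random.uniform(30, 90))  # linger like a human would
                break
    finally:
        driver.quit()

click_and_dwell("rjklhosrrtq", "randomworda.info")
```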
 
I like these kinds of tests, if only for the entertainment and curiosity value.

I see why some people are having the reaction they are, because this was the kind of stuff we did pre-Penguin when you could exploit every little variable and make a good chunk of cash. Those days are definitely gone.

The problem with these kinds of experiments with a single experimental variable is that you can't actually have a single experimental variable in a black-box environment. One thing you couldn't control for, even though you tried, is that when you submitted the URLs and domains to Google, they still weren't crawled and indexed at the same time. So what you're left with is 3 gTLD extensions plus the time factor, and the time factor can explain your results as validly as the extensions can.

Your clickbot test, yeah that 100% influences the SERPs. That's what @chanilla was just saying here in this thread. Google will randomly cycle the results to get more engagement metrics for each result in each position. This is also why if you go to Microworkers you'll see a lot of jobs for "Search this brand name and click this result, linger for 30 seconds and click another link and linger for 30 seconds." It's a way of getting it done without bots and proxies. Online flash game sites go nuts over these types of things, trying to buy 4 letter domains and influence CTR so they come up in auto-suggest, etc.

My guess is, if you determine .info and .me offer a 0.0001% boost over .review, none of that matters once you start doing activities that offer even 0.001% boost. If each link and social signal and article you publish even offers 0.001% boost, that's already exponentially more PER unit than your domain, which you can only choose once. In the end, it's inconsequential.

One thing that's certain that Google hasn't owned up to is that they will devalue domain extensions that spammers start using in droves because they're cheap. .info was like that for a while. Right now, .xyz is like that.

So I think it's more a matter of devalued than valued, and I'd also just recommend sticking to .com / .net / .org - especially .com if you can, if you're building a real brand. THAT will help your CTR and type-in traffic and all that more than any bot could, especially once you have some recognition.
 
Thanks for coming in too, Ryuzaki.

You are right, these small variables do not have massive effects. And the ranking algorithm is a combination that changes with each search query.

I still have to say that there is a place for single variable tests. Otherwise you are just blindly following what others tell you.

With regards to @chanilla's cycling of SERPs: you see something in the wild and think, has anyone else seen this? What could cause such a phenomenon? Can I test it? Can I take advantage of it?

I think the cycling of SERPs could be down to many things. For example, it seems like a bandit algorithm: it keeps the top positions open and then cycles through candidates to see which gets the best user metrics. If a site performs, it will increase in rank.

What that performance metric is, who knows. On-site time, search termination, bounce rate.
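As a toy illustration of that idea (pure speculation about what Google actually runs, of course), an epsilon-greedy bandit captures the "cycle results, keep the best performer" mechanism:

```python
# Toy epsilon-greedy bandit: mostly show the current best result, occasionally
# cycle in another candidate to gather engagement data on it.
import random

class SerpBandit:
    def __init__(self, candidates, epsilon=0.1):
        # per candidate: [impressions, total engagement reward]
        self.stats = {c: [0, 0.0] for c in candidates}
        self.epsilon = epsilon

    def pick_top_result(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))  # exploratory cycle
        return max(
            self.stats,
            key=lambda c: self.stats[c][1] / max(self.stats[c][0], 1),
        )

    def record(self, candidate, engagement):
        # "engagement" standing in for on-site time, search termination, etc.
        self.stats[candidate][0] += 1
        self.stats[candidate][1] += engagement

bandit = SerpBandit(["site-a", "site-b", "site-c"])
for _ in range(1000):
    shown = bandit.pick_top_result()
    bandit.record(shown, random.random())  # stand-in for real user metrics
```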

Probably (or definitely) a hell of a lot more complicated than that; Google does have the best minds working on this.

But what if you knew it was coming? You see it then add in some extra paid traffic to get user metrics up. Would that result in staying in that position?

You would have to experiment and observe it in a somewhat measured way. Controlling as many variables as possible.

It could be that every 4 months someone spams the hell out of the top 5 results, or that Google is obfuscating its results just to mess with SEOs. These are all theories until you test.
 
One thing that's certain that Google hasn't owned up to is that they will devalue domain extensions that spammers start using in droves because they're cheap. .info was like that for a while. Right now, .xyz is like that.

Those cheap domains already start with a negative reputation, hitting a lot of email spam filters from ESPs (Email Service Providers). For example, we had a customer set up an account with a .science domain and attempt to email us some questions. I happened to be browsing my junk folder and noticed this user got put in spam - it was a newly installed SpamAssassin configuration, so this was its default behavior.

Now if the cheap domains already have a problem inboxing emails, considering Google runs Gmail, the largest email provider, it's not a hard leap to conclude that these cheap domains will start off with some negative reputation.

It's an uphill battle, which doesn't make sense.

Let's say you get one of these 48-cent domains like .science - that one is already flagged as spam in email and starts off with EVERYONE having to whitelist you. But if the logic is to buy 100 of these domains for $48 - what's the long-term game plan there? Are you seriously going to build out 100 domains and run them as successful businesses? That's a ton of work, and you already know getting into the SERPs with one of those domains is an uphill battle.

You can make a whole lot more money doing a lot less work by putting the effort into a long-term asset instead of the "let's rank like it's 2007" game that is impossible to win. Plus you can't even send an email out to your customers/clients from these things without sounding unprofessional: "please check your junk mail."

My problem isn't with the experiments or case studies, I'd love to see more of these. However this particular one is targeting something that cannot make anyone any money any way you slice it with whatever conclusion is met.

Also, if you could prove that something like H1 tags are a major ranking factor, and so are H4 tags, while H2 and H3 don't do shit - would that not be helpful to your on-page SEO?

I find it difficult to believe there are people in SEO who do not realize that H1 tags are more valuable and have a bigger impact than H2-H6 tags. It's in most basic SEO newbie guides:

Heading tags (not to be confused with the <head> HTML tag or HTTP headers) are used to present structure on the page to users. There are six sizes of heading tags, beginning with <h1>, the most important, and ending with <h6>, the least important (1).

Sauce: Search Engine Optimization Starter Guide - by Google

What I would like to see, and am in the process of proving, is a study on domain age as a factor in SEO rankings; things along those lines are more measurable.
 
One thing you couldn't control for, even though you tried, is that when you submitted the URLs and domains to Google, they still weren't crawled and indexed at the same time.

What you are saying here is that crawl rate and time of indexing are ranking factors - that is easily testable, and if they're not factors then the tests should stand.
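If index timing is the suspected confound, it can at least be measured. A sketch that polls a site: query and timestamps when each page first appears (same scraping caveats as any Google automation; URLs are placeholders):

```python
# Sketch for timestamping when each test page first appears in the index, so
# index lag can be ruled in or out as a confound.
import datetime
import time

import requests

PAGES = [
    "randomworda.info/rjklhosrrtq-gfhurh",
    "randomwordb.review/rjklhosrrtq-gfhurh",
    "randomwordc.me/rjklhosrrtq-gfhurh",
]

first_seen = {}
while len(first_seen) < len(PAGES):
    for page in PAGES:
        if page in first_seen:
            continue
        html = requests.get(
            "https://www.google.com/search",
            params={"q": "site:" + page},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        ).text
        if page in html:
            first_seen[page] = datetime.datetime.now().isoformat()
            print(page, "first seen in index at", first_seen[page])
    time.sleep(6 * 3600)  # poll every six hours
```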

Online flash game sites go nuts over these types of things, trying to buy 4 letter domains and influence CTR so they come up in auto-suggest, etc.

Going to look for more about this.

So I think it's more a matter of devalued than valued, and I'd also just recommend sticking to .com / .net / .org - especially .com if you can, if you're building a real brand. THAT will help your CTR and type-in traffic and all that more than any bot could, especially once you have some recognition.

Totally agree - however for testing purposes el cheapo is my name.

One thing that's certain that Google hasn't owned up to is that they will devalue domain extensions that spammers start using in droves because they're cheap. .info was like that for a while. Right now, .xyz is like that.

Not only Google - Facebook hates them too. And it would appear that ESPs dislike them as well.

With regards to SEO, according to John Mueller:

Q: How will new gTLDs affect search? Is Google changing the search algorithm to favor these TLDs? How important are they really in search?

A: Overall, our systems treat new gTLDs like other gTLDs (like .com and .org). Keywords in a TLD do not give any advantage or disadvantage in search.

@CCarter You are right that it is not hard to conclude that they have some negative reputation. If this were a full-blown test over 30 or so domains, repeated, then you could see it for sure.

Common sense would say: when building a new site, never get a shit TLD.

However this particular one is targeting something that cannot make anyone any money any way you slice it with whatever conclusion is met.

Yes, this one is shitty. In my defense, it was a throwaway observation whilst setting up other testing that I thought would make an interesting thread.

Regarding the H1 tags - my example should have been a little better:

Also, if you could prove that something like nofollow links are a major ranking factor, would that not be helpful to your link building? Or links on noindexed pages?

In conclusion...

Current thinking is: don't buy shit TLDs.
 
First off, @Tenison, I want to say that I applaud the initiative and the method of thinking. Imagine what people could accomplish if more took this approach of actually doing.

That said, I see several major flaws. To give you an idea of my background, I've worked most sides, including and up to enterprise-level SEO. I've dealt with stuff in the big data realm quite a bit. I've seen strange SERP behaviors, about as strange as things can get.

To be blunt, the sheer number of variables and ranking factors at play makes most tests like this an exercise in futility. I know from first-hand experience, and it's destroyed my sanity. :wink: There are so many considerations to take into account to plan a test like this, it really is insane.

For example: have WHOIS privacy? Doesn't matter, Google is a domain registrar... Hosted with the same host? Same C block? B block? What if there are SERP filters specifically for lorem ipsum or other entirely gibberish site footprints (I could think of some easy footprints for that...)? I mean, the list goes on. This stuff will drive you insane. Trust me on this. Just fire up netstat or any other traffic monitor and take note of just how hard it is to avoid inadvertently sending data to the four horsemen of the big data-ocalypse.

I've actually seen some other people perform very similar tests. One guy was waxing eloquent over it, as if he was some guru. Newbs were eating it up like it was sacred geometry or something. I'm over here like, "Yeah, but we already know SERP behaviors differ down to the keyword level....so results = irrelevant."

What would be more effective is actual investment in legit sites, trying to normalize certain controllable aspects, spreading the sites across different niches, but in such a manner that they are comparable in nature (Blogs, info-based, all focusing on longtail, or whatever other factors you want to try standardizing with). The idea would be, see if cross-topical theme sites display similar effects from similar tests. Even then, that might only be an indicator that those specific niches, at that time, may have used similar algorithm factors and modifiers. There's nothing that says those won't be totally different for each niche the next day.

At the end of the day, this is a big data, machine learning, and increasingly AI driven problem. Trying to accurately decipher it to a degree of specificity can really only be done from a similar, technological standpoint. We need to consider the fact that these companies spend BILLIONS on creating these technologies, let alone algorithms specifically designed to fool and provoke reactions from you. At best, we can hope to see some rough correlations, relevant to our particular niches, that seem to work well more often than not. It's about the best we can hope for.
 