New SEO Insight Into Why Some New Sites Seem Blessed While Others Are Doomed

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,911
Likes
7,438
Degree
8
Okay... now we're getting somewhere!

Backstory & Advice
I've complained at length about something I've seen in the SERPs that drives me nuts. The basic explanation is that some sites seem to be "blessed" and absolutely rock the space that they exist in. They'll rank for everything and anything as long as they produce content for it. The worst part is, these sites blow. They're low-grade, low-usefulness, crappy, low-effort trash.

The same happens with good sites and we don't complain because it's justified. The sites are nice.

Because I've never been able to figure out why or how they're blessed, my advice (which I don't myself take due to the workload and infrastructure required) would be to create 5 versions of any authority site you want to build and see which one Google randomly "favors" as being better, even though they're all pretty much the same. Then roll with that one.

Webmaster Hangouts
In the most recent Google Webmaster Hangout, someone asked John Mueller:

“Theoretically, could a website that’s only one to two weeks old rank for top positions on Google for ultra-competitive head keywords say for example for shoes with significantly better content only, considering it’s the most important part of the core algorithm? If not then clearly time is a factor when it comes to ranking pages for highly competitive areas no matter how good they are unless new pages are created on already established websites.”

It's a nice question because it's phrased in a way that puts Mueller (the new Matt Cutts) in a corner live on air. To paraphrase, it's basically: "You guys insist content is the most important ranking factor, but we only see new content ranking for big terms when it's published on old and powerful domains. That means time is a ranking factor. Change my mind." They don't want to admit too much about time being how they fight spammers, because it means they can't actually handle spam otherwise, and it would also incentivize spammers to just scale harder and wait.

John buys some time at the start and throws in some plausible deniability by saying "this is all theoretical and a lot of things could be at play." Then he breaks character and says "But in practice things aren't so theoretical." Then he's right back to the game, "So I don't really know if there's a good answer that would be useful to give for something like this."

I think Mueller and Cutts and any of the other psyop spokespeople for Google are good people and want to be helpful. Mueller basically proves this by dropping some new hot fire information on us.

How Google Ranks New Websites
Mueller needs to explain away time as a ranking factor. He ends up doing so by being generous to those of us paying attention. Explaining away time means explaining that brand new sites can rank for competitive terms nearly immediately. This circles us back around to my huge complaint about some sites being blessed. He ends up saying:

“We use lots of different factors when it comes to crawling, indexing and ranking. And sometimes that means that completely new websites show up very visibly in search. Sometimes it also means that it can take a bit of time for things to settle down.”

If you read between the lines there and do some deductive reasoning: if a site is brand new, time is definitely not a factor in the initial considerations of 1) crawling, 2) indexing, and 3) ranking. That means it has to boil down to two things: 1) on-page SEO, and 2) technical SEO (so speed, code efficiency, bandwidth needs, information architecture, etc.).

So basically there's something going on with brand new sites where something matches up in terms of on-page SEO and tech SEO that makes Google think the site will be a good performer in the future, and they then go ahead and rank that site accordingly. If it doesn't perform, the site loses rankings. If it does perform, it sticks pretty highly.

Back in the day we called this the honeymoon period. We knew what it was: you index high, they get some data, and then you slip back (hopefully not very far). But now we're getting at exactly which pages get chosen for a honeymoon period. It doesn't seem to be as many as it used to be, because if it were, with the exponentially growing size of the web, the SERPs would be trash. You can't test every page out.

Mueller says exactly that. He says they make estimates and test pages and sites out based on those estimates, giving them more visibility. They can then win or lose. Losing can mean being doomed, winning can mean being blessed. This lines up perfectly with my observation. Some sites get doomed or blessed, but not all sites, just some. This seems to be why.

What it ultimately means is that you can become blessed if you can figure out which on-page and technical SEO factors matter, compared to data from similar sites in your niche and how they went on to perform, and then choose the right search terms to optimize for so you perform well once given extra visibility.

Mueller also goes on to say:

"And it can also be that maybe you’re shown less visibly in the beginning and as we understand your website and how it fits in with the rest of the web then we can kind of adjust that."
That sounds like an example of how the majority of new sites are treated. It's not that they understand how the website fits in with the others; it's simply that they don't test them out and don't have any signals on them yet, so you have to do the slow crawl up the rankings (because time is, in fact, a ranking factor). We've always called that the sandbox.

The question then becomes: for the majority of sites, how can you give Google some signals to work from if you can't rank? I'd guess that having Google Analytics is one way, using Google Fonts might be another, and having visitors that use Chrome is another. Asking people to search specifically for your site through Google is one more. Doing marketing, traffic leaks, and PPC definitely kick-starts things.

They can't be using only the data they get through their search engine, or nobody new would ever get exposure, and without exposure they'd rarely ever earn a link, while those with existing exposure would keep getting all the natural links.

Anyways, back to the honeymoon period. We know why they do it and what the results are, but how do they actually, mechanically, do it? There's a patent called "Ranking Search Results" that explains they have a score modification engine (which I believe is how Panda and Penguin work, as side-algorithms or layers on top of the core algo). They generate an initial score and then tweak the scores with respect to the query and based on a "plurality of resources."
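The patent's two-stage idea (a base score per page, then a modification layer applied on top for the query) can be sketched in a few lines of Python. To be clear, this is a toy illustration and not Google's actual scoring: the field names, the relevance formula, and the 1.5x "honeymoon" modifier are all invented for the example.

```python
# Toy sketch of a "score modification engine": compute a base relevance
# score per page, then apply modifier layers on top (the way Panda/Penguin
# are speculated to adjust the core score). Everything here is made up.

def base_score(page, query_terms):
    """Naive relevance: fraction of query terms present in the page text."""
    text = page["text"].lower()
    hits = sum(1 for t in query_terms if t in text)
    return hits / len(query_terms)

def modified_score(page, query_terms, modifiers):
    """Apply multiplicative score modifiers (quality layers) to the base score."""
    score = base_score(page, query_terms)
    for m in modifiers:
        score *= m(page)
    return score

# Example modifier: a hypothetical "new site test" boost that temporarily
# lifts pages from domains chosen for sampling.
def honeymoon_boost(page):
    return 1.5 if page.get("sampled_for_testing") else 1.0

pages = [
    {"text": "best running shoes reviewed", "sampled_for_testing": True},
    {"text": "running shoes store", "sampled_for_testing": False},
]
query = ["running", "shoes"]
ranked = sorted(pages, key=lambda p: modified_score(p, query, [honeymoon_boost]),
                reverse=True)
```

The point of the structure is that the core score stays untouched; layers can be added, tuned, or removed per query without re-ranking the whole index.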


This is food for thought, made more difficult by my rambling. But if anyone were so inclined, they could, given the time and resources, probably determine which sites in their niche are the baselines by rolling out sites modeled on all of the successful ones and seeing which get blessed or doomed, then repeating to confirm. And then go to town setting up winners.
 

secretagentdad

Keyword Sheeter - The Bestest Keyword Tool
BuSo Pro
Joined
Jan 29, 2015
Messages
235
Likes
278
Degree
1
Brand search volume from clean real users is the current hole in the Google boost network system, and it speaks volumes about what the related factors are.


Prove me wrong.
 

bernard

BuSo Pro
Joined
Dec 31, 2016
Messages
755
Likes
574
Degree
2
That's interesting.

I've seen different cases.

One of my first sites, in the health niche, ranked high almost right away: top 3 for a one-word keyword, and it stayed on the first page for 6 months. This with a basically new site up against very old, very established, huge health sites.

What was that about? Well, I've always considered this "honeymoon period" to be about testing with a live audience. If you fit some kind of criteria, you get some real-intent visitors so Google can check how they respond to your site. Could there be a situation where your feedback is just so much better than the competition's that Google lets you stay? Say you have some groundbreaking material. You break some news from Hollywood. Everyone googles for it to find the original source, not being satisfied with the many copycats ranking on authority alone. I think that's possible: if people open a lot of results for a query but keep coming back to yours. Or, in my case, the established sites simply didn't respond to the "URGENCY" of the intent. I fulfilled the urgency and stopped them from opening more tabs.

On the other hand, some niches and some sites seem to have zero honeymoon. My fitness equipment site is like this. It just won't budge. I attribute that to a "satisfied SERP": intent gets fulfilled and then some.

Like secretagentdad wrote, if people type in your brand name and a keyword, maybe that's one of those secret-sauce ingredients. Like "TMZ celeb scandals" for the keyword "celeb scandals".
 

secretagentdad

Keyword Sheeter - The Bestest Keyword Tool
BuSo Pro
Joined
Jan 29, 2015
Messages
235
Likes
278
Degree
1
Well, I've always considered this "honeymoon period" to be about testing with a live audience.
Welp. I’m pretty much going all in on theories resembling this one. I think it works. Public-ish case study, I guess. Lol
I think user experience and brand power trump all.
 

eliquid

SERPWoo
Digital Strategist
Joined
Nov 26, 2014
Messages
677
Likes
1,636
Degree
3
@CCarter and I have pushed this for a long time: that time is a ranking factor. We've pushed it for years, actually.

Everyone thought we were stupid and crazy.

Data doesn't lie though.

We are probably among the few that even have such data sets AND THE WANT to dive into them to help isolate and show it was a factor. Kinda the same as with NOINDEX links, which everyone thought we were stupid about.

You can attach sub-items to TIME, such as brand exposure and link build-up, but without TIME you won't achieve such things. Many experienced webmasters and devs have the resources to push thousands of dollars into links in the first and second months of a new site out of the gate, but without TIME those sites don't rank in the first 60 days....
 

CCarter

If they cease to believe in u, do u even exist?
Staff member
BuSo Pro
Boot Camp
Digital Strategist
Joined
Sep 15, 2014
Messages
2,551
Likes
5,873
Degree
6
When you are in an industry, or anything, for long enough, you start to instinctively know what to look for. When I examine websites, within the first couple of minutes I can tell whether the site performs well in the search engines or not.

It's a combination of experience and knowing small subtleties to look for.

For example if I can't find an eCommerce website's address or contact info, they are most likely not going to rank locally for local SEO terms. Sounds logical right? But a lot of people miss these small things.

Another example is blogs that remove their dates - well, you are essentially eliminating yourself from freshness factors altogether. I understand the logic of removing dates from comments, but removing dates from the blog posts themselves reduces your chances of showing up in the organic results.

I can tell you this because when I search for stuff, sometimes obscure coding stuff, 95%+ of the results have dates. Some state "2019", others go back to "2013". All those pieces of data are important for the end users to know whether the article they are reading is relevant or not.

For example, if I have an Ubuntu and MySQL issue, reading an article from 2013 doesn't help when the versions of the software they were using are way behind. So I have to find a fresher solution. The results ALWAYS have dates in the search results or on the page - yet SEOs are doing the exact opposite and removing dates. Then they can't figure out why they aren't ranking or why their site drops in the next algo. It's not an individual factor; it might be that your site simply is not "keeping up" with the standards.

Like @secretagentdad said, the biggest factor I see that can overcome these time limits is brand mentions and Google getting "signals" that this brand is more than just a weak website. It's why I push for social media mentions, and it's why I push to get YouTube videos going, because there are literally customized organic results for the YouTube Carousel, Twitter Feeds, and other aspects showing up in Google. ALL of those are based off of freshness factors, meaning - wait for it... dates!

A lot of this isn't rocket surgery. You can't just "publish a post" and wait for Google, or send some links and again wait for Google. You have to promote the post throughout the internet, and Google will notice that and connect it to your brand. And then, when Google knows you as being "someone", you'll be able to rank your future content more easily.

And guess what - the only way to get "Brand search volume from clean real users" is by "MARKETING" and "PROMOTING" your brand outside the search results.

[insert obligatory CCarter gif]
 

secretagentdad

Keyword Sheeter - The Bestest Keyword Tool
BuSo Pro
Joined
Jan 29, 2015
Messages
235
Likes
278
Degree
1
ughhh ^^^^^^
I hate when you're right about something I've fought to avoid doing.
I'm gonna start using dates this year.


Also, get in here, plebs. The best way to get good at SEO is to OP. OPs are OP, get it.....
Post some theories and get some holes shot in them. Or shit on my shitty approaches.
Criticism and text-wall theories work better when we beat up each other's approaches.
Actually doing the work is expensive, so there is real value in actually participating in the discussion.
We've got a working framework, with a few holes you can drive a truck through being outed.
Make some fucking money and stop being so poor.
Who knows, maybe you'll get invited to the good meetups and stuff if you post a bit.


More importantly,
Smaller search engines don't have the manual intervention component Google does, and they're starting to gain traction.

The signals they have to work with are what people type into their box and the internet's backlink graphs.
Types of content that are currently difficult to rank in Google do fine on a lot of them, since their index cleanup and templating procedures are more simplistic out of necessity.
 

stackcash

I Sell Words
BuSo Pro
Digital Strategist
Joined
Nov 19, 2014
Messages
709
Likes
1,152
Degree
3
the sandbox.
LOL. I feel like this OP was an extension of our conversation the other day.

But, yes, I 100% agree with what @Ryuzaki is saying here.

I forget where I read it, but there was a data-based case study put out sometime over the past year showing that the average domain age across ALL SERPs is 3.3 years. I have found this to be true.

Using my main site that I've been working on for 2 years now, here's what I've experienced:
  • There are new sites that appear to be outliers and get blessed almost immediately. Some of those sites have stuck, and some have dwindled. I'm assuming the sites that have stuck have proven that they deserve to stick. Some of these blessed sites are less than DR10, have 3x fewer links than us, and continue to rank hard for short tails in the industry.
  • It appears that all other new sites that didn't trip the "blessed" filter hit the sandbox.
  • The sandbox appears to be a spectrum, with the main determinant being time.
    • Example: When the site first launched, new posts would index in the 50's, pull back even further, and then slowly creep up over time. After a year, new posts would index in the 20's, pull back to the 50's, and then creep over time. After two years, new posts would index in spots 11-15, pull back to the 30's, and creep up over time.
    • Example: We've experienced several "big" algorithm updates that have caused rankings/traffic to fluctuate 5%-10%. At worst, we had a 30% swing after one update. With that being said, we did not change our strategy in any way during the existence of the site, and did not change our strategy as a result of any single update. We've come back from every update stronger than before without changing anything we were doing SEO wise. The only difference is that we are now an older site.
    • Example: More links does not compensate for less time. We have significantly more links (at a higher quality) than some older sites in our niche, and continue to rank behind them.
  • Slamming a page with quality links to "force" it to rank faster only seems to work if the SERP is lacking quality pages (aka low keyword difficulty). Slamming a page that targets a high difficulty keyword seems to cause rankings to DECREASE for a period (6+ months). Only after a period of time do those high-difficulty keyword targets break through their previous ceiling.
  • Site freshness and individual page freshness continue to pack the biggest punch in ranking improvements. The more often we post on the site, the bigger ranking and traffic gains we see. And, the more often we update existing content, the faster we see that content rank well.
  • Being first to the punch in regards to ranking for new/trending keywords is a gold mine. We run Google Alerts to surface new companies in our industry that we can review. When we are successful in posting the FIRST review article for a company, it seems almost impossible to beat. Using this strategy, we ranked one review article that represents 30% of our existing revenue, and another that represents about 5% of our existing revenue. And, to date, the keywords for these new review opportunities continue to show less than 100 in search volume on industry-standard tools.
it's not an individual factor; it might be that your site simply is not "keeping up" with the standards.
This sounds like what all these new correlational tools like Surfer and POP are trying to resolve.

Taking an 80/20 approach here, I feel like these tools can get you 80% of the way there in regards to keeping up with the standards. The rest of the way will need to be covered via manual analysis... like you did in finding the dates issue.
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,911
Likes
7,438
Degree
8
Opening Post Summary
I feel like I need to summarize the opening post for the benefit of the TL;DR crowd.

I'm not talking about certain posts doing well on an existing site, or domains doing well or bad after time, or even how to accelerate out of the sandbox.

I'm talking specifically about brand new domains that get "blessed or doomed." Blessed ones not only bypass the sandbox but bypass everyone in the SERPs and slam right to the top and stick. Some sites get doomed and never have any chance of ranking, even when they've done nothing wrong.

The point was that Mueller has admitted that Google (through seemingly random sampling) estimates what a brand-new site's user metrics might be by comparing it to other successful sites. The only things they have to go off of are on-page SEO and technical SEO, and I'm leaning towards tech SEO being the main factor in this.

Figuring out exactly what it is they like for a niche means easy and fast domination, saving time waiting and money on links. You can dominate through pure content development.

Some blessed sites don't last because they don't satisfy the users and their user metrics cause them to tank. Doomed sites never even get a chance to prove their worth.

It's a two-pronged approach. If you can figure out how to get blessed, you win. If you can figure out that your site is doomed without waiting 12 months, you win. Actually testing these things isn't really pragmatic or worth the time.

I think the best approach would be to develop 5 or even 10 versions of a site. Different themes, same keywords being targeted, fairly similar content. One or two are going to pop, most will do the normal slow-grow, and one might get the doom stamp. From there you can focus on the winners and use the others as a PBN or simply get rid of them.
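If you did run 5 or 10 parallel versions, you'd want a mechanical way to sort them into blessed / normal / doomed from their rank trajectories. Here's a rough sketch, assuming you log a weekly average position per domain from whatever rank tracker you use; the cutoffs (top 10 in the first month = blessed, stuck past position 80 for 12+ weeks = doomed) are arbitrary guesses for illustration, not known thresholds.

```python
def classify(trajectory, blessed_cutoff=10, doomed_cutoff=80):
    """Classify a new domain from its weekly average rank positions.

    trajectory: list of weekly average positions, week 1 first (lower = better).
    The cutoffs are illustrative guesses, not known Google thresholds.
    """
    if min(trajectory[:4]) <= blessed_cutoff:
        return "blessed"   # cracked the top 10 within the first month
    if len(trajectory) >= 12 and min(trajectory) >= doomed_cutoff:
        return "doomed"    # three months in and never left page 8+
    return "normal"        # the usual slow crawl out of the sandbox

# Hypothetical tracking data for three candidate versions of the same site:
candidates = {
    "site-a": [8, 12, 9, 15, 11],
    "site-b": [95, 92, 96, 90, 94, 95, 91, 93, 96, 95, 94, 92],
    "site-c": [60, 55, 48, 44, 41],
}
verdicts = {name: classify(t) for name, t in candidates.items()}
```

From there you'd focus the content budget on whichever domain classifies as blessed and retire or repurpose the rest.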

___________


Other Stuff
Brand search volume from clean real users.
I knew a guy that used to build flash game portals on 4 letter domains. He didn't care what they were as long as they were 4 letter dot coms. Then what he'd do is head over to Microworkers and pay a zillion people a nickel each to go type his 4 letter brand name (XGRZ.com for example). He was influencing the auto-suggest, but he felt that cascaded out into SEO benefits.

It would seem to be key to have it sustained, and the only way to pull that off is to be a real brand through marketing and superior everything. You have to get in front of millions of eyeballs and get people actually searching your brand and get an actual search volume on the books. (As well as doing other obvious brand footprint things across the web).

Well, I've always considered this "honeymoon period" to be about testing with live audience.
Yeah, that's exactly what I'm getting at, which Mueller has now admitted to. But back in the day everyone got the honeymoon period and it was on a page-by-page basis. It seems now, since the internet is exponentially larger, they have to dole it out randomly, and not to pages but to domains. Which explains the "blessed or doomed" observations I've been crying about the past year here. It's nice to get confirmation again that I'm not imagining things.

Two other notable pieces of confirmation: one was when I swore they had inverted the index. I thought it was some crafty ploy to get a baseline of user metrics for bad sites; Matt Cutts said some doofus just accidentally replaced a > with a < once. The other was Mueller, at a private dinner, confirming my sitewide Panda Quality Score theory, which finally leaked out years later.

Which all supports this idea...

Post some theories and get some holes shot in them
There are tidbits people drop on here all the time that are worth millions of bucks to the person who applies them. One in particular I always remember is how @CCarter discovered that something like 90% of Sears Canada's emails were going to spam, right at the start of the mad holiday rush. If they had read it, it would probably have been at least a billion-dollar revenue difference, and it's a problem any number of us could solve for them.

Stuff like our page speed talks can add an unbelievable amount of money to an e-commerce store's revenue overnight if they're already hitting big numbers. The on-page discussion, the freshness factor thread, the index bloat talk: lots of this stuff can change businesses overnight. Let alone the Crash Course for newbies. Go look at the list of threads there and see how many reads the early ones get and how it peters out towards the end. Free money sitting there for the taking, and people are too lazy to read, let alone type and ask questions.
 
Joined
Nov 25, 2014
Messages
6
Likes
11
Degree
0
Ok, I’m going to be honest, I haven’t been deeply involved in SEO for a while now. If you know me, you know that I am working for a fast growing startup. One of the things that really sets us apart from competitors is that we have a CEO that is very active on Twitter. While the company only started in the middle of 2017, that has caused a massive amount of social mentions, brand searches, etc.

In 2018 I did a small test of transcribing a couple of our training videos and slapping them up on our website to see what kind of traction we could get from an organic perspective. Keep in mind these posts are not optimized, and were something that took me about 5 minutes of cleaning up the transcript to get posted.

I went and checked today and for some queries we’re pushing out sites like Stack Overflow, which is a behemoth in the space. We’re also young compared to a lot of these companies/competitors. One of our main competitors (that was acquired) was started in 2012, but we’re ranking decently for queries where they should be crushing us, and we haven’t even looked at SEO. (Which is going to change very soon now that I’ve looked at this data.)

Takeaway is that brand signals seem to be assisting in ranking. Hard to say 100% because the Twitter activity has resulted in a number of posts being written about us on various large blogs/websites. So we’ve gotten the brand stuff, that has pushed some nice links.
 

CCarter

If they cease to believe in u, do u even exist?
Staff member
BuSo Pro
Boot Camp
Digital Strategist
Joined
Sep 15, 2014
Messages
2,551
Likes
5,873
Degree
6
Cora says they test against 2040 ranking factors, and recently put out a list of the top ones:


Magically, there is data to prove that I'm right about dates. Yet all these blogs have gone out of their way to remove publishing dates from their content. Or better yet, people do the lazy thing of just updating the date and not the actual content, even though Google's threshold for duplicate content is 40% unique.


I think I should come out with an SEO blog or something.
 

mikey3times

BuSo Pro
Joined
Aug 25, 2018
Messages
113
Likes
88
Degree
0
people do the lazy thing of just updating the date and not the actual content even though Google's threshold for duplicate content is 40% unique.
Are you suggesting that you need to add/edit/remove 40% of an article for Google to consider it to be fresh?

I use my CMS’s updated date for inserting the date. I don’t use published date so that there aren’t competing dates. So the updated date changes if I fix a typo, add a clarifying graphic, or rework some text. Given the evergreen nature of the site, there isn’t need to change much, but I don’t want my 2012 article next to someone’s 2020 article so I make small edits here and there.

I don’t think I’ve ever been penalized for small changes with an updated date. Maybe google knows the difference between “reference” topics and “news” topics.
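One way to avoid "competing dates" while still exposing both to crawlers is structured data: display a single date in the template, but declare `datePublished` and `dateModified` in Article markup (those are real schema.org properties; the headline and date values below are made up for the example). A sketch of generating the JSON-LD block, as a CMS template might:

```python
import json

# Hypothetical article metadata; in a real CMS these would come from the post record.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Evergreen Guide",
    "datePublished": "2012-06-01",  # original publish date, kept for crawlers
    "dateModified": "2020-01-15",   # bumped on each meaningful edit
}

# Wrap the metadata in the standard JSON-LD script tag for the page <head>.
markup = '<script type="application/ld+json">%s</script>' % json.dumps(article, indent=2)
print(markup)
```

That way the visible page shows only the updated date, while both dates stay machine-readable.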
 

CCarter

If they cease to believe in u, do u even exist?
Staff member
BuSo Pro
Boot Camp
Digital Strategist
Joined
Sep 15, 2014
Messages
2,551
Likes
5,873
Degree
6
Are you suggesting that you need to add/edit/remove 40% of an article for Google to consider it to be fresh?
To clarify: if you have a page, then duplicate the page and update at least 40% of the content, Google will mark the new page as unique.

An example in the wild: think of all those recipe sites that have similar recipes. They figured it out and started putting their life stories above the recipe, because otherwise it would be considered duplicate content.
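The 40% figure is CCarter's number, not anything Google has published, but you can at least measure how different a rewritten page is from its source. A rough sketch using word shingles (the 5-word window size is an arbitrary choice):

```python
def shingles(text, k=5):
    """Break text into overlapping k-word windows (shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def unique_fraction(new_text, old_text, k=5):
    """Fraction of the new page's shingles that don't appear in the old page."""
    new_s, old_s = shingles(new_text, k), shingles(old_text, k)
    return 1 - len(new_s & old_s) / len(new_s)

# A verbatim copy scores 0.0 unique; a full rewrite scores 1.0. Under the
# claimed threshold, you'd aim for a result of at least 0.4 on the new page.
recipe = "bring the sauce to a simmer and stir in the fresh basil at the end"
```

Adding a long life story above an otherwise-copied recipe raises this number, which is consistent with the pattern described above.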
 

secretagentdad

Keyword Sheeter - The Bestest Keyword Tool
BuSo Pro
Joined
Jan 29, 2015
Messages
235
Likes
278
Degree
1
Here's my current voodoo recipe.

Search on Google, click Tools > Any time, and set a date range.
Looking at your distribution of rankings under various different timelines can be really enlightening.
I try to do it every week in my bigger niches.
Once you get that blessed status, you pop when you publish.
You can identify when it's time to go ham on the content because you pop onto page one whenever you write something within your short-tail keyword tree.
When you get a good description and document combination, you work your way up with time instead of down. Double down on the winners and just delete the stuff that's sinking.

If you watch the time series, you can quickly spot which competitors' content is being taken seriously for testing. Then you can blog about their brands, their BRAND or product NAME vs. your brand or product name, and try to get into their branded auto-suggest.

When you delete content regularly you seem to get some kind of crawl-priority boost.
This appears to be less true now than a year back, but I think it still works.
I'm speculating that it might not be a real-time perk any more.
A lot of stuff appears to have moved behind some kind of QA index these days.
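The date-range recipe above can be made systematic if you log each result's position and visible publish date for your keywords (however you collect them: manual checks or a rank tracker). A small sketch that buckets a SERP snapshot by content age so you can see which timeline a niche favors; the bucket edges (6 and 24 months) and the sample data are invented:

```python
from datetime import date

def age_buckets(results, today):
    """Average SERP position per content-age bucket.

    results: list of (position, publish_date) pairs for one query.
    Bucket edges are arbitrary illustrative choices.
    """
    buckets = {"<6mo": [], "6-24mo": [], ">24mo": []}
    for pos, published in results:
        months = (today.year - published.year) * 12 + (today.month - published.month)
        key = "<6mo" if months < 6 else "6-24mo" if months < 24 else ">24mo"
        buckets[key].append(pos)
    # Average position per bucket; None where the bucket is empty.
    return {k: sum(v) / len(v) if v else None for k, v in buckets.items()}

# Hypothetical snapshot for one keyword:
snapshot = [(1, date(2019, 11, 1)), (5, date(2019, 1, 1)), (9, date(2016, 1, 1))]
averages = age_buckets(snapshot, today=date(2020, 1, 1))
```

If fresh buckets consistently average better positions in a niche, that's a signal freshness is being rewarded there and it's worth going ham on publishing.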
 

BCN

$$$$ ¯\_(ツ)_/¯ $$$$
Joined
Dec 26, 2015
Messages
135
Likes
156
Degree
1
To clarify: if you have a page, then duplicate the page and update at least 40% of the content, Google will mark the new page as unique.

An example in the wild: think of all those recipe sites that have similar recipes. They figured it out and started putting their life stories above the recipe, because otherwise it would be considered duplicate content.
I fucking hate that. I want to cook some sauce, and they tell me their whole life story and how their grandmother used to make it when she came as an immigrant from ... Just give me the recipe.

On the topic of dates though: a lot of the dev articles have old publish dates, but keep getting updated, and also provide an 'updated date'.
 
Joined
Sep 27, 2016
Messages
9
Likes
4
Degree
0
I think the best approach would be to develop 5 or even 10 versions of a site. Different themes, same keywords being targeted, fairly similar content.
A practical question... in my country you need an imprint on your site (your name, address, tax number), and I wonder if it would harm my sites if multiple sites in the same niche with similar content popped up at the same time. I guess it would be very obvious to Google that I'm trying to hit an algo winner. The question is whether this will have negative effects or not, hm.

It would seem to be key to have it sustained, and the only way to pull that off is to be a real brand through marketing and superior everything. You have to get in front of millions of eyeballs and get people actually searching your brand and get an actual search volume on the books. (As well as doing other obvious brand footprint things across the web).
If you are in a smaller niche without any strong competing brands, wouldn't it be enough to use the brand searches mainly at the beginning to increase the odds of getting a blessed site?
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,911
Likes
7,438
Degree
8
I wonder if it would harm my sites if multiple sites in the same niche with similar content popped up at the same time.
I doubt they're running any checks like that. Most giant publishing brands have tons of sites in the same niches. Not just verticals, but niches. Scroll down through these brands: https://www.futureplc.com/brands/ They all have the same business details at the bottom like you're describing.

I'd probably host them on separate cheap servers until I know which one is the best, then move the winner over to my main server, and drop the others into the aether. This process could take a long time. You'll still have to contend with the sandbox if one doesn't magically pop through. One will still be more favored than the others, and finding out which one could take as long as a year.

Unless you're the kind of guy already pumping out boatloads of sites, I'm not sure how practical any of this is. We have to find a way to stack the deck in our favor or we're just pissing in the wind and hoping, and potentially hoping for a long time in order to get nothing out of it. That's the danger of SEO altogether.

Maybe one thing to do would be to identify a blessed site in the niche you want to enter, find out what theme they're using for their CMS, and use the same one. You can change colors and whatever to make it different, but the point would be to mimic their HTML layout for the most part. You can also identify what plugins they're using and how they're set up, what areas of the site are indexable, what's in the robots.txt, etc. You can look at their sitemap, and so forth, and mimic them as much as possible on the technical side, while superficially masking what you're doing on the design side.
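The robots.txt and sitemap parts of that technical-mimicry step are easy to automate. A minimal sketch that pulls the `Sitemap:` declarations out of a robots.txt body (fetching the file over HTTP is left out, and `example.com` is a placeholder, not a real target):

```python
def sitemap_urls(robots_txt):
    """Extract Sitemap: declarations from a robots.txt body (case-insensitive key)."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("sitemap:"):
            # Split only on the first colon so the URL's own colon survives.
            urls.append(line.split(":", 1)[1].strip())
    return urls

# Sample robots.txt body, as you might fetch from a competitor's site:
sample = """
User-agent: *
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap_index.xml
sitemap: https://example.com/news-sitemap.xml
"""
found = sitemap_urls(sample)
```

From the sitemap URLs you can then walk the competitor's indexable structure and compare it section by section against your own.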