How To Find Terms With Low DR Sites Ranking 1-5 in SERPs?

Is there a tool or combination of tools that can find kws where low DR sites are ranking in positions 1-5 in the SERPs?

I'd like to enter a topic, seed keyword, and/or list of keywords into a tool, then have it keep or find terms that have 2 or more sites in positions 1-5 with a sub-10 DR or so (these numbers are just figurative).

I've used Ahrefs, Surfer, and kwsheeter so far, and to my understanding they only have their own form of KD, which doesn't specifically filter out terms that are dominated by large, high DR sites. I.e. some terms will have a KD of 0, but 4/5 of the ranking sites have a DR of 90+.

Open to buying any tool, building around an API, outsourcing (though time intensive), etc., but figured I'd see if there's something like this out of the box, or try to figure out the best avenue to create such a tool.
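For what it's worth, the core filter being asked for here is simple to script once you have SERP data and DR values from somewhere (Scrapebox, an API, a batch export, etc.). A minimal sketch, with made-up file and column names:

```python
# Minimal sketch: keep keywords where >= 2 of the top-5 results come from low-DR domains.
# Assumes two CSV exports with hypothetical column names:
#   serps.csv -> keyword, position, url   (from a SERP scraper or rank-tracking export)
#   dr.csv    -> domain, dr               (e.g. from an Ahrefs Batch Analysis export)
from urllib.parse import urlparse

import pandas as pd

DR_CEILING = 10        # "sub-10 DR" from the post above; tune to taste
MIN_LOW_DR_SITES = 2   # at least this many low-DR sites in positions 1-5

serps = pd.read_csv("serps.csv")
dr = pd.read_csv("dr.csv")

serps["domain"] = serps["url"].map(lambda u: urlparse(u).netloc.replace("www.", ""))
serps = serps[serps["position"] <= 5].merge(dr, on="domain", how="left")

# Unknown DR is treated as low here; use fillna(100) instead to be conservative
low_dr_counts = (
    serps.assign(is_low=serps["dr"].fillna(0) <= DR_CEILING)
    .groupby("keyword")["is_low"]
    .sum()
)

easy_wins = low_dr_counts[low_dr_counts >= MIN_LOW_DR_SITES].index.tolist()
print(easy_wins)
```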
 
What's your intent with this, hit it with AI content on an aged domain? Sometimes low DR sites have great content; sometimes there'll be Reddit/Quora/forum results on top with high DR that you can outrank easily, which you'd miss with this method. Or a high DR site with half-baked content that you can compete with.
 
What's your intent with this, hit it with AI content on an aged domain? Sometimes low DR sites have great content; sometimes there'll be Reddit/Quora/forum results on top with high DR that you can outrank easily, which you'd miss with this method. Or a high DR site with half-baked content that you can compete with.
This. No tool can replace good old-fashioned know-how and your eye. It's kind of like a saw: anyone can use it, but only a skilled craftsman can make good furniture with it.

The replacement at my old company, who was my "boss" but was an idiot, thought that tools were his power. I told him otherwise, as a tool is only as good as its user. He didn't like it, but now they have great Surfer optimization which is like +5% in traffic... but they only push out 0 articles/month and could get much more reach if they published more articles. Short-sighted IMO. You've got to keep content updated and also push out new content to reach new niches. But this is a tangent about tools and people.

But, to answer OP's question, the tool you're looking for is called Google. You google the keyword yourself and see if you can rank or not. That's all. Good old Google. It's free too!

If you're asking for a tool just so you can buy something that will do this for you, then you're lazy. Plain and simple. I'm very anti-consumerism and you can't just buy a tool to do everything for you. That's what a salesman wants you to believe.
 
You can use the Content Explorer in AHREFs and accomplish this to a degree. I do not believe you can do so at the term level, but you can filter domains by DR and see what keywords they're ranking 1-5 for, do the same with their competitors, etc.
 
What's your intent with this, hit it with AI content on an aged domain? Sometimes low DR sites have great content; sometimes there'll be Reddit/Quora/forum results on top with high DR that you can outrank easily, which you'd miss with this method. Or a high DR site with half-baked content that you can compete with.
The intent is to find easy wins for new sites of ours with a low DR... not some AI bs. I'm aware there are larger sites with "half-baked content" that can be taken down, and we've done so for many terms where intent wasn't met, or where we're more optimized for the query.

Finding kws where forums are ranking in the SERPs is really easy through Ahrefs, so I don't really care about those kws.

Furthermore, when you're competing with absolute powerhouses -- think WebMD -- a few low DR sites ranking well in the SERPs is a good indication that we can rank too... regardless of whether their content is great or not, because I'm confident we can always make ours better.

But, to answer OP's question, the tool you're looking for is called Google. You google the keyword yourself and see if you can rank or not. That's all. Good old Google. It's free too!
This isn't a short-sighted attempt to ignore checking the SERPs manually. It's merely an attempt to find low-hanging fruit.

If you're asking for a tool just so you can buy something that will do this for you, then you're lazy.
This is like saying "why pay for Ahrefs when you can simply build your own KW research tool". Not very helpful; I'm willing to build it if need be or use their API. Just seeing if anyone has done it before.

If you can pay for a tool that achieves a specific goal and it saves you time, that's a no brainer.
 
If you can pay for a tool that achieves a specific goal and it saves you time, that's a no brainer.
That's what the people who make these SEO tools want you to believe, and I've used my fair share of tools. Some tools calculate keyword difficulty by just adding up the Domain Rating (DR) of the top 10 results and dividing by 10. Not kidding. Their keyword difficulty score is useless.
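For clarity, the naive score described there is literally just the mean DR of the top 10 results; a made-up example:

```python
# The naive "keyword difficulty" the post describes: just the mean DR of the top 10.
top10_dr = [92, 88, 75, 70, 64, 60, 55, 41, 30, 12]  # made-up example SERP
naive_kd = sum(top10_dr) / 10
print(naive_kd)  # 58.7
```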

And I'm not telling you to make your own tool. You misunderstood me. I'm saying that the tool to use is google.com. Google is a tool and you can use it for search engine optimization. You don't need a tool to decipher Google for you and you can do it yourself without assistance.

Like @Poplanu said, you can just Google the keyword by hand and check whether you can outrank the results there or not, if you know what you can and can't compete against. If your competitive advantage is long-form content and the top 3 is short-form content from high DA sites, you can compete. If the top 3 is forum posts, you can compete with just a 500-word blog post on a new domain. If the top 3 is WebMD and you have an MD writing for you and your domain is DA60, you can compete if you make a better article with better on-page SEO.

No tool is going to gauge your money site's ranking ability, your SEO skill, and your business's competitive advantage and tell you whether you can rank for a keyword or not. No tool can do this, and this is why my SEO is good and I'm not afraid of other SEOs entering my niche. They can't compete against me by simply subscribing to a $99/month SaaS. Plain and simple. Skill wins any day.
 
You can use the Content Explorer in AHREFs and accomplish this to a degree. I do not believe you can do so at the term level, but you can filter domains by DR and see what keywords they're ranking 1-5 for, do the same with their competitors, etc.
Thanks. I've played with Content Explorer some but never for this particular need. Looking at it now, and I think this is the best option; I'll just have to stack on filters. I'll report back if I find anything useful that's not blatantly obvious.
 
You don't need a tool to decipher Google for you and you can do it yourself without assistance.
It's something we've seen work for fresh sites of ours, and there are a zillion other factors to consider rather than a simple "if a low DR site is ranking in the SERPs, then it must be easy to rank". Which is to say, you have a point, but we're still going to use this as one of many factors when filtering out kws for new sites.
 
DIY would proooobably be Scrapebox + the Ahrefs API, but that sounds like a lot of work and pretty expensive.
 
DIY would proooobably be Scrapebox + the Ahrefs API, but that sounds like a lot of work and pretty expensive.
That's not how I would do it. I'd set up a Google Sheet or an Airtable base and have my keyword guy answer questions. It'd be stuff like:
  1. Considering how we write content, is there any content here that's bad enough that we can write something better?
  2. Is there UGC in the top 10?
  3. Are there competitors whose link building ours would overtake, so we'd outrank them simply with more backlinks, everything else being equal?
  4. Given the volume and 1-3, how likely are we to get an ROI on this if we need 1 sale a month to make it a good investment? 1, 2, 3, 4, or 5?
100% qualitative data, and I'd narrow it down to a quantitative score. IMO, like learning a language, it's more of a "feeling", and we're trying to systematise that "feeling" with our processes.
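For illustration only, here's a minimal sketch of how answers like those four could be collapsed into one number; the field names and weights are made up, not the actual system described above:

```python
# Minimal sketch of turning the qualitative answers into a single priority score.
# Weights and field names are hypothetical assumptions, not the poster's system.
from dataclasses import dataclass

@dataclass
class KeywordReview:
    keyword: str
    weak_content_in_top10: bool   # Q1: can we write something better?
    ugc_in_top10: bool            # Q2: forums/Q&A ranking?
    links_beatable: bool          # Q3: could our link building overtake theirs?
    roi_estimate: int             # Q4: 1 (poor) to 5 (great)

def priority_score(r: KeywordReview) -> int:
    """Collapse the four answers into a single 0-10 priority score."""
    score = r.roi_estimate                       # ROI carries the most weight
    score += 2 if r.weak_content_in_top10 else 0
    score += 2 if r.ugc_in_top10 else 0
    score += 1 if r.links_beatable else 0
    return score

review = KeywordReview("best ergonomic keyboard", True, False, True, 4)
print(priority_score(review))  # 7
```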

It takes about a week to train someone on a task like this, and to make it worth it you've got to pump out hundreds of articles a week. But at the end of the day, this guy is worth his weight in gold and no tool can replace him. This is mentoring and training your staff to build up your company's human capital.
 
@MWWWconquest
I don't have an exact solution for this but would love to have something.

However, here's what I do:
1) I create a list of the keywords I rank for in the top 10, from Ahrefs and GSC.
2) Put it in Scrapebox and bulk harvest the top 5 results for all the keys.
3) Remove duplicate domains. When I start with a list of 1K keys, I usually end up with ~500-1000 domains.
4) Put it in Ahrefs Batch Analysis. You can only do 200 domains at a time, so do this 3-5 times and copy the data into one Excel file.
5) Sort by DR. I keep everything that is <=DR30.
6) Manually check all the DR<=30 domains. Last week when I did this, I got 250 domains that were <=DR30 from a list of 800.
Usually, 10-20% will be very usable (niche-related) domains, and 10-20% will be somewhat good.
The rest is crap, but it's part of the game.

I did this last week, and now I have 300 new keywords to write about and 100-200 more if I take another shoulder niche.
This is what I will do to take a website from $10K to $20K, and this is what I did to take it from $3K to $10K. It's not rocket science.

If a DR<=30 site can rank for the key, you can also rank if you have a half-decent site.
Probably out of 300 articles, I will be able to get 150-200 in top 10, which is good enough.
Not perfect, but good enough.
Don't do everything manually. It's too much work.
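If it helps, steps 3-5 above are easy to script. A minimal sketch, assuming a plain-text Scrapebox harvest and CSV exports from Ahrefs Batch Analysis (file and column names are assumptions; check your own exports):

```python
# Sketch of steps 3-5: dedupe harvested domains, chunk them for Batch Analysis,
# then merge the exports and keep only <=DR30.
import glob
from urllib.parse import urlparse

import pandas as pd

# Scrapebox harvest: one URL per line
with open("harvested_urls.txt") as f:
    domains = sorted({urlparse(line.strip()).netloc.replace("www.", "")
                      for line in f if line.strip()})
print(f"{len(domains)} unique domains")

# Split into chunks of 200 for Ahrefs Batch Analysis (its per-run limit)
for i in range(0, len(domains), 200):
    with open(f"batch_{i // 200 + 1}.txt", "w") as out:
        out.write("\n".join(domains[i:i + 200]))

# After exporting each Batch Analysis run as CSV, merge them and filter by DR
batch = pd.concat(pd.read_csv(path) for path in sorted(glob.glob("batch_export_*.csv")))
low_dr = batch[batch["Domain Rating"] <= 30].sort_values("Domain Rating")
low_dr.to_csv("low_dr_domains.csv", index=False)
```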
 
What is UGS?
UGC, not UGS. It means User Generated Content (forums, comments, etc.). Q&A sites often get a pass since they're "moderated" in the sense that people upvote and downvote the content.
 
However, here's what I do:
1) I create a list of the keywords I rank for in the top 10, from Ahrefs and GSC.
2) Put it in Scrapebox and bulk harvest the top 5 results for all the keys.
3) Remove duplicate domains. When I start with a list of 1K keys, I usually end up with ~500-1000 domains.
4) Put it in Ahrefs Batch Analysis. You can only do 200 domains at a time, so do this 3-5 times and copy the data into one Excel file.
5) Sort by DR. I keep everything that is <=DR30.
6) Manually check all the DR<=30 domains. Last week when I did this, I got 250 domains that were <=DR30 from a list of 800.
Usually, 10-20% will be very usable (niche-related) domains, and 10-20% will be somewhat good.
The rest is crap, but it's part of the game.
Thanks mate, will actually give it a go!

To add to this - if someone's just starting out with a new site and isn't ranking for much, you can pull a competitor's keywords and get similar results.

What are your settings on Scrapebox and how many proxies do you use? It's been a long time since I last used it, and IIRC you need to go pretty slow so that the proxies don't get burned. Also, if you could rec a good proxy vendor... IDK, every time I search for them the SERPs are filled with affiliate spam.
 
No proxy is needed.
I burn 10-15 VPN IPs for the harvest (I change IP after every ban)
Yeah, I know, I'm not a nice guy.
 
@Philip J. Fry I agree, nothing replaces the human eye. Too many factors for a tool to cover.

However, I personally have always wanted a tool that does exactly what OP is also looking for (automated). If you can utilize a trained VA, I agree that would be best though.

As far as automated methods, @janky way sounds like the best method so far, or Ahrefs content explorer.

Here's what I do:

1. Enter a seed keyword into Ahrefs and filter to keywords below KD 20.
2. Start manually searching each low-difficulty KW, sorted by top volume.
3. While checking the low KD keywords in the SERPs, copy every low DR domain you see ranking in positions 1-5 into Excel (you can easily find 10-20 in a short amount of time).
4. Use Site Explorer to export the top pages of each domain, then combine them all into one CSV (open a terminal, go to the folder with all the CSVs, and type: cat *.csv > merged.csv to merge them; a pandas alternative that skips the repeated header rows is sketched below).
5. Now you have a big list of top-traffic pages from many low DR domains (sort by top traffic).
6. Now you have a list of a lot of pages (topics) that get traffic. Since they're on low DR domains, the odds are better that they're easier to rank for as well. A good way to get tons of low-competition topics to target (it's more about topics than keywords these days anyway, IMO).

Of course, this is just one vector of analysis (DR). You still have a big unknown, and that's the quality of the content and whether it sufficiently matches the user intent of the given keyword/topic.
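Here's the pandas alternative mentioned in step 4, a minimal sketch assuming the Top Pages exports all sit in one folder and share a "Traffic" column (names are assumptions; check your own export):

```python
# Merge the Ahrefs "Top pages" exports without the repeated header rows that
# plain `cat` leaves behind, then sort by estimated traffic.
import glob

import pandas as pd

exports = sorted(glob.glob("top_pages_exports/*.csv"))
merged = pd.concat((pd.read_csv(path) for path in exports), ignore_index=True)

merged = merged.sort_values("Traffic", ascending=False)
merged.to_csv("merged_top_pages.csv", index=False)
print(f"Merged {len(exports)} exports into {len(merged)} rows")
```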
 
I second LowFruits.io. It's easily the best keyword research tool I've used, and they just keep adding useful new features.
 
I second LowFruits.io. It's easily the best keyword research tool I've used, and they just keep adding useful new features.
I signed up for this and bought their Black Friday sale for 500,000 credits at $150 (50% off).

My KW research guy and I tested it out, and we think the tool is stupid. I read the help guides and the methodology is correct, but the results we got for our niche were quite dumb.

First off, we ran "keyword" through LowFruits and Ahrefs. We received the same number of results from both SaaSes, which led us to the conclusion that LowFruits uses the Ahrefs API for keyword scouring. Does LowFruits scrape the People Also Ask questions too? Yes it does, but the volumes there are 10 or so queries a month, which is "too low" for us.

One feature that I liked was the clustering feature, where they cluster keywords according to the top 10 results that appear: when keywords have similar SERPs, they're clustered together. Great! However, in actual use, about 10-20% of the clusters were dumb and made no sense. The other 80-90% were great, and the implementation would be to make a content brief for each cluster to target all the LSI keywords that LowFruits grouped together. This was a good feature, but IMO it didn't merit the cost of LowFruits, as it took about 1,000 credits to get the results for one keyword. At 1/500th of the regular $300, that's $0.60 a run. That's actually quite expensive IMO. Anyone doing "enterprise level" SEO or SEO for many sites will run through that in a month or two.
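LowFruits doesn't publish its exact algorithm, but the general idea described above (group keywords whose top results overlap) can be sketched roughly like this; the 0.4 overlap threshold and the sample data are arbitrary assumptions:

```python
# Rough sketch of SERP-overlap clustering: keywords whose top-10 URL sets
# overlap enough get grouped together. Not LowFruits' actual algorithm.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_by_serp_overlap(serps: dict[str, set], threshold: float = 0.4) -> list[list[str]]:
    """serps maps keyword -> set of top-10 URLs. Greedy single-pass clustering."""
    clusters: list[list[str]] = []
    for kw, urls in serps.items():
        for cluster in clusters:
            # Compare against the cluster's first keyword as its representative
            if jaccard(urls, serps[cluster[0]]) >= threshold:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters

serps = {
    "best ergonomic keyboard": {"a.com/1", "b.com/2", "c.com/3"},
    "ergonomic keyboard reviews": {"a.com/1", "b.com/2", "d.com/4"},
    "standing desk height": {"e.com/5", "f.com/6"},
}
print(cluster_by_serp_overlap(serps))
# [['best ergonomic keyboard', 'ergonomic keyboard reviews'], ['standing desk height']]
```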

Another downside is that it pulls PAA and all these other queries. After a while, the report is like 3,000+ keywords, which is WAY too much noise.

IMO, I'll just stick with Ubersuggest from Neil Patel for $290 one time. LowFruits sounded useful but wasn't useful in practice. Their methodology is correct, but using it burns through credits sooo fast and produces too much noise. It's not worth the time, effort, and energy IMO when an Ubersuggest report is very consistent. I also dislike Ahrefs for the same reason: it gives too much data.

... and if someone wants to buy my LowFruits account, PM me. I'll sell it for like $110 with all remaining credits... you'll get like $20 free and this is with the Black Friday deal that's no longer available.

Also, if you use the other LowFruits feature (looking up the keywords of a domain), you can just use https://domainsheeter.com from @secretagentdad instead. Much cheaper, and a lower purchase price.
 
I nabbed some Lowfruits credits to try out, too.

The most useful thing I've found to do with this tool, which I really like, is to work backwards.

Instead of trying to find keywords from scratch, I'll import a list of keywords that my site already ranks for.

I took all the keywords that I rank #1 for, imported them into LowFruits, then used the Competitors tab, which gives me a list of sites I'm beating.

This allows you to see a list of domains you're already able to beat, how many different keywords you're outranking them for, a few basic metrics about each competitor, and which ad network they're using, which gives you a rough idea of their traffic (if they're in Mediavine or AdThrive, for instance).

From there, you just need to put together a list of everything else the beatable domains rank for that you aren't targeting yet, and that gives you an infinitely better starting point for which KWs to target than using it as a traditional KW research tool.
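That last step is basically a set difference, and if you export the keyword lists it's easy to script; a minimal sketch, assuming simple CSV exports with a "keyword" column (file and column names are hypothetical):

```python
# Sketch of the final step: keywords the beatable domains rank for that you
# don't target yet. File and column names are assumptions, not a real export format.
import glob

import pandas as pd

my_keywords = set(pd.read_csv("my_keywords.csv")["keyword"].str.lower())

gaps: set[str] = set()
for path in glob.glob("beatable_competitors/*.csv"):
    theirs = set(pd.read_csv(path)["keyword"].str.lower())
    gaps |= theirs - my_keywords

pd.DataFrame(sorted(gaps), columns=["keyword"]).to_csv("keyword_gaps.csv", index=False)
print(f"{len(gaps)} candidate keywords to target")
```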
 