Rank trackers and their impact on search volume

Potatoe (BuSo Pro)
If a keyword gets 1000 monthly searches and the next month, 100 people on a hundred different rank trackers start checking it every day... does Google distinguish those rank trackers from actual users, or does the search volume become 4000? I imagine this could be tested by using a made-up phrase and inputting it into some rank trackers and seeing if that influences the monthly search volume... I'm wondering if anyone has any observations on this.
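Just to spell out the arithmetic I'm assuming here (the numbers are the hypothetical ones above, nothing measured):

```python
# Hypothetical scenario from the question above: does automated rank
# checking get added to the reported monthly search volume?
real_monthly_searches = 1000   # actual human searches per month
trackers_checking = 100        # people/trackers checking the keyword
checks_per_day = 1             # each one checks once per day
days_in_month = 30

automated_queries = trackers_checking * checks_per_day * days_in_month  # 3,000
inflated_volume = real_monthly_searches + automated_queries             # 4,000

print(f"If bot queries counted: {inflated_volume} vs. {real_monthly_searches} real searches")
```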

Also, a second question. Let's use SW as an example because I have it open in another tab. If 100,000 SerpWoo users are tracking the keyword "Builder Society", would a rank tracker (I guess different trackers might work differently...) just do one query and apply it to all of their users? In that case, my answer to the first question is that tracking the ranks wouldn't have a huge influence on search volume, since it's only adding +1 query per [however often it updates] for each separate tracker, rather than for each user of each tracker... I would imagine it working like that for a tracker like SW that monitors the entire SERP for a keyword, but for the older-style ones that only track one page's rank for one keyword at a time... maybe they're doing separate search queries for each user to determine their rankings? In which case, I'm guessing something like SW leaves a much smaller footprint on the overall search volume for a keyword than an older-style rank tracker does... (Disclaimer: I really don't know how these apps work, so I'm taking stabs in the dark here and thinking out loud...)
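To make my guess concrete, the "one query per keyword, shared across all users" model I'm imagining would look roughly like this. This is purely a hypothetical sketch, not how SerpWoo or any real tracker is actually built:

```python
from collections import defaultdict

# Hypothetical tracker data: (user, keyword) pairs being monitored.
# Many users can track the same keyword.
subscriptions = [
    ("user_1", "builder society"),
    ("user_2", "builder society"),
    ("user_3", "builder society"),
    ("user_1", "rank tracker"),
]

def fetch_serp(keyword):
    """Stand-in for a single Google query pulling the top 100 results."""
    return [f"result_{i}_for_{keyword}" for i in range(1, 101)]

# Group users by keyword, then fetch each keyword's SERP exactly once.
users_by_keyword = defaultdict(list)
for user, keyword in subscriptions:
    users_by_keyword[keyword].append(user)

results_for_user = defaultdict(dict)
for keyword, users in users_by_keyword.items():
    serp = fetch_serp(keyword)                  # 1 query per keyword per update cycle...
    for user in users:
        results_for_user[user][keyword] = serp  # ...shared by every user tracking it

print(f"{len(subscriptions)} subscriptions -> {len(users_by_keyword)} Google queries per update")
```

Under that model, adding more users tracking the same keyword adds zero extra queries; only adding new keywords does.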

The reason I'm curious about this is that I'm tracking my CTRs in my SERPs and trying to improve them. I know the monthly search volume numbers provided by Google are already kind of muddy and not exact, but I'm trying to figure out roughly how muddy they are. If 90% of searches for a particular keyword are just people trying to track the rank, I can assume my CTR among actual users is much higher than what's being displayed, right? And looking at long-tail keywords that don't get huge volume, if even a handful of people are tracking them... that could potentially account for a huge chunk of the search volume, depending on how it all works... Which is why I made this thread.

Hey there!
I don't have a definitive answer, but this is something that's crossed my mind. I assumed that Google does not filter out these searches versus "real" searches.

Also, I'd say that while it's interesting to know the REAL value of the CTR, I don't think you'll get it, but thankfully a relative value will still let you measure change, which is what we're really trying to affect.
 
Also, I'd say that while it's interesting to know the REAL value of the CTR, I don't think you'll get it, but thankfully a relative value will still let you measure change, which is what we're really trying to affect.

Very good point! Would I need to track the changes in monthly search volume as well as changes to my CTR to get a more accurate representation, or does that not really matter because I'm looking at a ratio anyways? My mind is full of fuck already because I'm not great with statistics lol.
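For what it's worth, here's a toy calculation (all numbers made up) of why the ratio should still be usable for measuring change even if bot impressions inflate the denominator, as long as the bot share stays roughly constant:

```python
# Toy numbers: measured CTR = clicks / (real impressions + bot impressions).
# Bots add impressions but never click, so measured CTR is deflated by a
# constant factor of (1 - bot_share) -- relative changes are preserved.

def measured_ctr(clicks, real_impressions, bot_impressions):
    return clicks / (real_impressions + bot_impressions)

bot_impressions = 900            # e.g. 90% of "searches" are rank checks
real_impressions = 100

before = measured_ctr(clicks=5, real_impressions=real_impressions,
                      bot_impressions=bot_impressions)   # 0.5% measured (5% real)
after = measured_ctr(clicks=10, real_impressions=real_impressions,
                     bot_impressions=bot_impressions)    # 1.0% measured (10% real)

print(f"Measured CTR went from {before:.1%} to {after:.1%}")
print(f"Relative improvement: {after / before:.1f}x (same as the 'real' improvement)")
```

The caveat is that this only holds if tracking activity stays roughly flat; if the bot share jumps one month, the measured CTR will move even though real behaviour hasn't.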
 
1. I can, without a doubt, guarantee Google does not count automated systems, rank trackers included. You can see this simply by querying that made-up keyword: even though some rank trackers pull it 2 to 24 times a day, 365 days straight, that keyword's volume will still be 0 after a year. I'd rather not discuss exactly how I know this, but I can give a hint by telling you to look at Google's Keyword Planner. They've dropped the raw keyword volume data and switched to the number of cookie sessions in which that keyword gets queried. "Most" rank trackers do not store cookies (a rough, purely illustrative sketch of that cookie-session idea is at the end of this post). There are a whole lot more hints and clues as to how Google figures out who's a real visitor and who's a bot, but I've written about this in the past and would rather not unearth some of the methods.

2. Rank trackers pull the top 100. Some pull every page up to the top 100, or whatever depth they claim, so that's 10 queries in that case, or 1 for the ones that pull the full top 100 in a single query. The thing is, every tracker HAS to pull all the data needed for the top 100, because remember, they did a Google query. It's just that only SERPWoo stores the entire top 100, whereas the other trackers simply look for the domains of the users monitoring that keyword and don't store the rest of the SERPs; there's no reason for them to.

SEMRush, Spyfu, and two other majors store all the SERP data to an extent, but don't really have the same display SW has.
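Purely to illustrate the cookie-session idea from point 1, here's a made-up toy filter. This is NOT a claim about how Google actually does it, just the general shape of "count sessions, not raw queries":

```python
# Purely illustrative sketch of the "cookie sessions" idea hinted at above --
# NOT Google's actual method. Volume is credited per session that carries a
# cookie, so cookieless automated pulls never move the counter.

sessions = [
    {"query": "builder society", "has_cookie": True},    # cookied human browser
    {"query": "builder society", "has_cookie": True},
    {"query": "builder society", "has_cookie": False},   # headless rank-tracker pull
    {"query": "builder society", "has_cookie": False},
    {"query": "builder society", "has_cookie": False},
]

counted_volume = sum(1 for s in sessions if s["has_cookie"])
raw_queries = len(sessions)

print(f"Raw queries seen: {raw_queries}")      # 5
print(f"Volume credited:  {counted_volume}")   # 2 -- the bot pulls don't register
```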
 
Yep, not that it's needed, but I can confirm what CC says above. That's not to say that competitive terms aren't skewed a bit at times by manual rank checking from SEOs/clients and other little tools. But this is Google; they have that stuff pretty well accounted for, things like repeated searches from the same user, etc.
 
Search volume's not that accurate either way. I can show you a site that's #2 for a 2,900 vol keyword that receives 800 hits a day (mostly to the page that targets the 2,900 vol keyword). So, yeah, take it with a grain of salt.
 
Search volume's not that accurate either way. I can show you a site that's #2 for a 2,900 vol keyword that receives 800 hits a day (mostly to the page that targets the 2,900 vol keyword). So, yeah, take it with a grain of salt.

Agree, but that could be because of shitloads of longtails, not inaccurate data?
 
Not to devalue your question, but one of the more important things I've learned from working with a partner more experienced than I am is to never trust search volume estimates from Google... ever.

I'm sure you can inch out better conversions / more revenue by optimization. It's just been my experience that going after new / more long-tail keywords usually turns into more revenue, faster.
 
1. I can, without a doubt, guarantee Google does not count automated systems, rank trackers included. You can see this simply by querying that made-up keyword: even though some rank trackers pull it 2 to 24 times a day, 365 days straight, that keyword's volume will still be 0 after a year. I'd rather not discuss exactly how I know this, but I can give a hint by telling you to look at Google's Keyword Planner. They've dropped the raw keyword volume data and switched to the number of cookie sessions in which that keyword gets queried. "Most" rank trackers do not store cookies. There are a whole lot more hints and clues as to how Google figures out who's a real visitor and who's a bot, but I've written about this in the past and would rather not unearth some of the methods.

Really sure about that? I've been doing some tests which I've discussed in the VIP section of the GSA forum and automated queries do get picked up by Search Console, Analytics and Keyword Planner. Trends didn't pick it up, but I guess the query volume was too low. Autocomplete didn't pick it up either unfortunately.

Not going to openly discuss the program I used, but the proxies were private, residential, and public, and I noticed no difference in pick-up rates. That being said, while I tested branded terms, exact terms, and a combination, and CTR rates increased massively in Search Console, rankings didn't budge at all.

Make of it what you will, but either CTR isn't a ranking factor (just as Mueller and Illyes have been claiming), or it's way too hard to manipulate and Google simply 'stealth' allows Keyword Planner / SC to pick these queries up to prevent reverse-engineering of their actual spam detection algorithm.
 
Really sure about that? I've been doing some tests which I've discussed in the VIP section of the GSA forum and automated queries do get picked up by Search Console, Analytics and Keyword Planner. Trends didn't pick it up, but I guess the query volume was too low. Autocomplete didn't pick it up either unfortunately.
I may have a slight advantage due to owning a rank tracker that pulls millions of keywords daily for our users, most with zero volume, and that volume continues to stay at zero even after 2+ years of pulling the rankings 2-3 times a day. I also know other owners of rank trackers that pull the same amount of daily queries or more, and the volumes for these keywords never increase in ANY keyword tool, but especially not Google's.

Google's switching to cookie sessions as an indicator for volume for a reason too.

Make of it what you will, but either CTR isn't a ranking factor (just as Mueller and Illyes have been claiming), or it's way too hard to manipulate and Google simply 'stealth' allows Keyword Planner / SC to pick these queries up to prevent reverse-engineering of their actual spam detection algorithm.

You might want to look into PandaBot and why they've been so successful, since your statement contradicts their existence. As well, we've been doing a case study at SERPWoo, which we'll be releasing soon, that goes a bit more in-depth on CTR and how it affects rankings. It's not as simple as just clicking on your SERP link. You have to click on the link when there is a bump/test/boost happening to that ranking... :wink:

https://twitter.com/recalibrate/status/596034771148414977

[screenshots attached to the tweet above]


More details coming soon...
 
I may have a slight advantage due to owning a rank tracker that pulls millions of keywords daily for our users, most with zero volume, and that volume continues to stay at zero even after 2+ years of pulling the rankings 2-3 times a day. I also know other owners of rank trackers that pull the same amount of daily queries or more, and the volumes for these keywords never increase in ANY keyword tool, but especially not Google's.

Google's switching to cookie sessions as an indicator for volume for a reason too.
You might be right re: cookie sessions and rank trackers (never checked and I frankly don't care), but that's not what you said. You claimed automated systems do not work and that's simply not true.

Have a look at these pieces of Keyword Planner data from some of my automated experiments:

[two Keyword Planner screenshots from the automated experiments]

(last one only used public proxies BTW)

Too bad Search Console only goes back 4 months or I could show the impressive click-through rates as well.


You might want to look into PandaBot and why they've been so successful, since your statement contradicts their existence.
Come on now. I might as well look into Fiverr gigs to rank my sites since their mere existence is ultimate proof they work?!

Me and others have run very extensive tests with Pandabot, Serpify, Crowdsearch.me (which claims to use human searches) and Insane Google Ranker and they do jack shit to improve your rankings. Some guy on the GSA forum even coughed up hundreds for residential UK IPs to use with IGN and still nothing!

Not only that, but I've spammed classified ad websites as well to gain thousands of real human queries and guess what? Still no results!

And with all due respect for those graphs, they just seem like normal ranking fluctuations. Just look at the volatility spikes at times when CTR testing wasn't running. Same patterns.

That being said, I'm always open to adjust my opinion when ample evidence is demonstrated, so definitely looking forward to your case study. :smile:
 
but that's not what you said.
I think you confused yourself about the question OP asked and my answer to mean something else. Look at the title of the thread, the question again, and then my answer.

Regarding CTR, just because YOU and YOUR group of associates have not been able to do something doesn't mean others have not. Maybe your way of doing it simply didn't work because you missed or didn't consider some variable(s). I dunno, I can't comment on or refute vague top-secret tests from unknown sources with unknown skills. I don't know what methods you used to test, how long you ran your tests, or at what volumes. You could've tried for 5 mins, 5 days, 5 weeks, or 5 months.

I can only comment on tests I've done and have been a part of.

If I can run a $10 server and do rudimentary detection of users connecting to my SaaS apps through a VPN or proxy, something even Twitter and companies like Hostgator can do, I'm going to put my money on the probability that Google can too. That is probably ONE of the multiple variables you might not have considered.
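For anyone wondering what "rudimentary" detection might look like, here's a minimal toy sketch of the kind of checks a cheap server could run. It's a hypothetical illustration, not the actual setup of any particular SaaS, and real services layer far more signals on top (ASN lookups, fingerprints, behaviour, etc.):

```python
import ipaddress

# Minimal sketch of rudimentary proxy/VPN detection -- hypothetical only.

# Headers that transparent proxies commonly add or forward.
PROXY_HEADERS = {"via", "x-forwarded-for", "forwarded", "proxy-connection"}

# Example ranges a site might flag as "datacenter, not residential"
# (illustrative placeholders, not a real blocklist).
DATACENTER_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),      # TEST-NET-1, stands in for a hosting provider
    ipaddress.ip_network("198.51.100.0/24"),   # TEST-NET-2
]

def looks_like_proxy(client_ip: str, headers: dict) -> bool:
    """Very rough heuristic: proxy-ish headers or a datacenter-looking IP."""
    if any(h.lower() in PROXY_HEADERS for h in headers):
        return True
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in DATACENTER_RANGES)

# Example: a request arriving through a forwarding proxy from a datacenter IP.
print(looks_like_proxy("198.51.100.7", {"Via": "1.1 someproxy"}))   # True
print(looks_like_proxy("203.0.113.5", {"User-Agent": "Mozilla"}))   # False
```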
 
I think you confused yourself about the question OP asked and my answer to mean something else. Look at the title of the thread, the question again, and then my answer.
I did and I stand by my interpretation. You said automated systems (rank trackers) included don't get counted and it's obvious the tools I mentioned above do get counted.

Regarding CTR, just because YOU and YOUR group of associates have not been able to do something doesn't mean others have not. Maybe your way of doing it simply didn't work because you missed or didn't consider some variable(s). I dunno, I can't comment on or refute vague top-secret tests from unknown sources with unknown skills. I don't know what methods you used to test, how long you ran your tests, or at what volumes. You could've tried for 5 mins, 5 days, 5 weeks, or 5 months.

If I can run a $10 server and do rudimentary detection of users connecting to my SaaS apps through a VPN or proxy, something even Twitter and companies like Hostgator can do, I'm going to put my money on the probability that Google can too. That is probably ONE of the multiple variables you might not have considered.

Come on now, most of it is mentioned in my previous posts, and even thousands of real human searches over a two-month period didn't pan out. I've been quite meticulous with my tests, used dozens of variables, and considered many possible options, but it just didn't work.

The fact that you mentioned Pandabot and used their 'existence' as proof it works, probably says enough regarding your own testing. I'm done with this discussion.
 
I did and I stand by my interpretation. You said automated systems (rank trackers) included don't get counted and it's obvious the tools I mentioned above do get counted.

It's actually not obvious cause you want to keep your tests in the dark, so I will continue to stand by everything I state.
rank trackers (never checked and I frankly don't care)
^^ The title and subject of the thread are literally right there, so if you "didn't care about rank trackers" while looking at a thread titled "rank trackers and their impact on search volume", I can only assume you're confused about what we're talking about. I believe you're throwing GSA SER into the same category as rank-tracking automation; I personally wouldn't put them in the same category, but that's most likely where your confusion came from.

The fact that you mentioned Pandabot and used their 'existence' as proof it works, probably says enough regarding your own testing. I'm done with this discussion.

[reaction GIF]


But to be serious, time's too short to waste talking about vague GSA SER tests. I just don't have time for arguments that throw inferences around and lead to vague conjectures. If you don't like my conclusions, bring actual data supporting your argument. I can't argue against vagueness. Arguing on a forum with a random person on the internet about semantics and definitions is just not at the top of my agenda anymore.

Good luck mates, I'll see you on the other side of the veil.
 
Not trying to seem like I'm adding fuel to a fire, because I didn't really see this convo as negative, just a discussion. Although doing the "I'm taking my ball and going home" thing is stupid.

I'm siding with @rogerke just because I myself also use/created bots that do what he's saying (search and click sites on Google), but that's only if I'm understanding @CCarter right.

So, just to clear up, CCarter, are you saying tools like rank trackers (ones that are querying from the same IPs) are not affecting search volume? A bot that affects auto-suggest is different than what you're saying?

It's kind of a pointless question but a good one (the OP's). I have wondered myself whether a tool I used that changes Google's auto-suggest affects the search volume in keyword tools. It's always hard to be 100% sure when you see your brand's name in Google Keyword Tool and wonder if the bot affected that search volume. My gut says if you're fooling their algo one way (again, the auto-suggest example), it's probably reflected in all their tools.
 
So, just to clear up, CCarter, are you saying tools like rank trackers (ones that are querying from the same IPs) are not affecting search volume?
Rank Trackers do not affect search volume.
 
For anyone still interested in CTR manipulation, Rand Fishkin did a case study (link: https://moz.com/rand/queries-clicks-influence-googles-results/) and a BHW member just started a Q&A thread talking about the CTR manipulation he's done in the last 20 months.

Using proxies? DOESN'T WORK
Using fake clicks? DOESN'T WORK
Using real people? WORKS

link: http://www.blackhatworld.com/blackh...-ive-learned-about-manipulating-serp-ctr.html

I assumed it was universal knowledge that CTR is a ranking factor, but seeing this discussion it seems there are people still in the dark. Moz and blackhats have done tests leading to the same conclusion that CTR is a ranking factor.

Not even humans affect the search volume...
That's puzzling cause I also didn't understand what @rogerke meant by that.
 
Yep, can confirm, CTR doesn't have an impact on rankings... nothin' to see here folks... move along...

[used car salesman meme image]


If you get a decent CTR and don't rank higher it's because CTR doesn't impact rankings, it's definitely not because there are any other factors at play. Speaking of which... social doesn't impact rankings either, so don't even bother with that nonsense.

Sarcasm aside...

Me and others have run very extensive tests with Pandabot, Serpify, Crowdsearch.me (which claims to use human searches) and Insane Google Ranker

I know you can't really believe that those services are going to behave like actual people though, right? So that point is moot. Again, it's like buying up a bunch of fake likes and then saying social doesn't impact the SERPs. Even if they're using actual people, that's FAR from realistic browsing behaviour.

And what type of keyword were you using for your classified test? Was it a brand name? That could behave differently than other keywords would.

I know you said you're done with this discussion, but I figure an IM forum is a good enough place as any to discuss IM.

Ultimately, we're all trying to get to the bottom of this so we can use it to our advantage, it's not about anybody winning an argument, it's about sharing information so we can all find out exactly what's what.

So hopefully my questions can spur some more discussion on this topic. The reason I ask is because I've had a page ranking for a ridiculous keyword and bringing in tens of thousands of organic searches a day with zero backlinks or SEO considerations, aside from doing semi-well on social and having a title that people literally couldn't not click, compared to everything else that was around it in the SERPs.

If CTR didn't at least help it to stay there after social got it to the party, I can only attribute it to magic. I'd rather not believe in magic, if I can help it.
 
There's a reason for CTR manipulation being one of the highest volume and most frequent tasks requested on Microworkers.
 
Well guys, I was 'done' with this discussion because it didn't seem to go anywhere and turned into a semantics / trust argument. But now that it's been reopened, I feel confident stepping in again and having some proper arguments.

@juliantrueflynn

I also used a lot of brand (and exact match variations) queries as well. Mainly because of this and this patent and although these queries got picked up (altered search volume in Analytics, Search Console and Keyword Planner, but not in autocomplete), it didn't improve rankings. This was after the last Panda update though, so if these patents are indeed part of the Panda algorithm it needs a new refresh to show results.

@Ryuzaki
They without a doubt do. Whether the exact volume statistics are right, though, is a whole different story, and I agree Keyword Planner's data seems to be waaay off sometimes.

And there doesn't actually have to be a valid reason why people use these Microworkers gigs. Deception and ignorance are all too common, and they're the main reason the dietary supplement industry is a multi-billion dollar industry. Really, these ipse dixit / testimonial arguments have no place in discussions like these.

@j a m e s
Well, it's easy to state that it's obvious these tools behave differently from real users, but without doing any testing you'll never find the holes in Google's algorithm.

Guys, to me the question isn't whether these tools get picked up by GA/SC/KP. They clearly do, whether you use residential IPs, private proxies, or even public proxies. The question to me is whether a) CTR is a ranking factor at all, and if so, b) why the hell GA/SC/KP pick up these fake queries but rankings don't improve. As said before, one of my hypotheses for b) is that Google lets the reporting tools pick these queries up but doesn't actually use them in its ranking algorithm, to prevent reverse-engineering of its actual spam detection algorithm.

One more thing. Open up the CTR data for any of your websites in Search Console and look for keywords with a high CTR relative to their ranking position. I bet you'll notice a lot of them have had a high CTR for months while their rankings didn't improve at all. That shouldn't be happening if CTR were really a ranking factor, right?
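If you'd rather do that at scale than eyeball it, here's a rough sketch using the Search Console search analytics API. It assumes you've already set up OAuth credentials in `creds`, and the expected-CTR-by-position table is just a made-up placeholder, not an official benchmark:

```python
from googleapiclient.discovery import build

# Rough sketch: pull query-level CTR/position from Search Console and flag
# keywords whose CTR is well above a rough "expected CTR for that position".

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}  # rough guesses only

def high_ctr_keywords(creds, site_url, start_date, end_date, multiple=1.5):
    service = build("webmasters", "v3", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query"],
            "rowLimit": 1000,
        },
    ).execute()

    flagged = []
    for row in response.get("rows", []):
        position = round(row["position"])
        expected = EXPECTED_CTR.get(position)
        if expected and row["ctr"] > expected * multiple:
            flagged.append((row["keys"][0], row["ctr"], row["position"]))
    return flagged

# Example usage (hypothetical site and dates):
# for query, ctr, pos in high_ctr_keywords(creds, "https://example.com/",
#                                          "2016-01-01", "2016-03-31"):
#     print(f"{query}: CTR {ctr:.1%} at avg position {pos:.1f}")
```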
 
What if the goal was to fuck with the volumes in high-value, long-tail-heavy industries, to promote shitty keywords and hide the stuff with real value on shitty public keyword tools...

What if the goal was to supply AdWords advertisers with some sort of "useful" data to help them spend money, and not to serve SEOs at all?
 
Think about what you wrote.

A keyword in a niche associated with high traffic value (and therefore high CPCs) builds up a bit of volume.

Few advertisers are targeting it. In whose interest is it to get that keyword onto their customers' radar...?
 
I see your point, but I'm not sure that most of their customers (if I understand you correctly) are that savvy.

Maybe they are encouraging people to start at a macro level and narrow their focus using the data as it comes in. I can't speak for G, I can only hypothesize.

I was just trying to point out that SEOs are complaining about a tool that wasn't built for them.

edit: I probably missed your point entirely.
 