Indexing multiple sites at once

Joined: Dec 14, 2017 · Messages: 42 · Likes: 10 · Degree: 0
What do you use to index multiple pages at once? I've used commenting services to index web 2.0 properties with success, but paying per property gets expensive as hell. Any reliable methods to get this done in bulk?

I'm open to suggestions for specific services or paid tools (with reasonable pricing).
 
If you have a Google account, you can submit URLs for indexing right in the SERPs by searching something like "submit url to google." A submission box will appear that you can use. That said, if it's spammy stuff I wouldn't recommend doing it. I've been using it for cases where Ahrefs finds a forum link to my site that Google hasn't found or indexed yet. It gets indexed within hours, as long as it's not trash. Though I've gotten them to index some trash this way too.
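If you want the programmatic version of that submit box, Google also has an Indexing API (officially limited to job-posting and livestream pages, so use at your own risk for anything else). A minimal sketch, assuming you've already set up a service account and have an OAuth access token for a verified Search Console property; `ACCESS_TOKEN` and the URL list are placeholders:

```python
# Sketch: building a publish request for Google's Indexing API.
# Assumes an OAuth 2.0 access token for a verified Search Console
# property already exists; the token and URLs here are placeholders.
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_request(url, access_token):
    """Build the publish request for one URL (type URL_UPDATED)."""
    body = json.dumps({"url": url, "type": "URL_UPDATED"}).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
    )

# Firing it for a batch is then just:
# for url in my_urls:
#     urllib.request.urlopen(build_request(url, ACCESS_TOKEN))
```

There are daily quota limits on this API, so it's for modest batches, not thousand-URL blasts.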
 
There's a lot of indexing software out there that will send your list of URLs to a million ping services that Google supposedly crawls.
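Under the hood those tools are mostly firing the old `weblogUpdates.ping` XML-RPC call at long lists of ping services. A minimal sketch of what one of those pings looks like; the Ping-O-Matic endpoint in the comment is just an example, and real tools cycle through hundreds of (mostly dead) services:

```python
# Sketch of the classic weblogUpdates.ping request that bulk
# "indexer" tools send to ping services.
import xmlrpc.client

def build_ping(site_name, site_url):
    """Serialize an XML-RPC weblogUpdates.ping request body."""
    return xmlrpc.client.dumps((site_name, site_url), "weblogUpdates.ping")

# Sending it is just an HTTP POST of that body to each service, e.g.:
# proxy = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# proxy.weblogUpdates.ping("My Web 2.0", "https://example.com/")
```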

Another method would be to shoot social shares out at all of them and then spam the social profiles.

The method back in the day, when you didn't want to do bulk spam, was high PR dofollow blog comments; the high PR guaranteed a fast crawl rate. Like you said, that's costly these days even for copy-and-paste comments, and it's still spam and not what you should want.

Another method would be to create one web 2.0 that acts as an HTML sitemap that links to all the others and get it crawled, but of course then you're tying them all together.
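The HTML sitemap idea is trivial to script. A bare-bones sketch that turns a list of property URLs (placeholders here) into one crawlable page:

```python
# Sketch: generate a minimal HTML sitemap page that links every
# web 2.0 property, so getting one page crawled exposes all of them.
from html import escape

def build_sitemap(urls, title="Links"):
    """Return a single HTML page with one <li><a> entry per URL."""
    items = "\n".join(
        f'  <li><a href="{escape(u, quote=True)}">{escape(u)}</a></li>'
        for u in urls
    )
    return (
        "<!doctype html>\n"
        f"<html><head><title>{escape(title)}</title></head>\n"
        f"<body><ul>\n{items}\n</ul></body></html>"
    )
```

Host that page anywhere crawlable and point a link or two at it; just remember it footprints all the properties together, as noted above.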

If you want to stay away from spam and ping services, then your web 2.0s need to be good enough to warrant indexing and good enough to attract some decent links so they can be found. And that's not really worth doing in bulk either.
 
What kind of link count are you after? There's a service called Index Inject that basically submits URLs directly to Google for you using their own submit form, but it can get a bit pricey since it requires captcha solving.
 