Google Search Console - New Coverage Report - Extremely Useful

Ryuzaki

I'm posting this as a public service announcement: the new Coverage report is amazingly useful.

I just explained in depth why I'm even using this thing over in my journal, if you're interested. I'll use the same screenshots here so you can see what it looks like and explain how it's helpful to me, and to you too.

When you navigate to the Coverage dashboard, you'll be greeted with a chart that has four toggles across the top you can click on. They are:
  • Error
  • Valid with Warnings
  • Valid
  • Excluded
You can load all four into the chart at once, just one, or any mix of them.

So here's what my Error panel looked like:

[Screenshot: Error panel]


I had a "Soft 404" error because I had a special 404 page in my sitemap (didn't realize it). So they'd try to index it but would get a 404... obvious problem. I'd never have noticed if not for this new feature. There are a ton of error types that can show, but you should deal with these immediately if you have any.
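If you want to catch this sort of thing yourself instead of waiting for the report, you can fetch your sitemap and check the status code of every URL listed in it. Here's a minimal sketch in Python; the sitemap URL is a placeholder, and note it only catches hard errors (a true soft 404 returns a 200 with thin "not found" content, so that needs a content check on top):

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder, point at your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch the sitemap and pull out every <loc> URL
with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

# Request each URL and flag anything that doesn't come back 200
for url in urls:
    try:
        with urllib.request.urlopen(url) as r:
            status = r.getcode()
    except urllib.error.HTTPError as e:
        status = e.code
    if status != 200:
        print(status, url)  # e.g. a 404 page that slipped into the sitemap
```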

Here's my Valid panel:

[Screenshot: Valid panel]


This, as far as I can tell, has two types in it: submitted in sitemap and indexed, and indexed but not in sitemap. Some of this can be problematic, so take a look through the "indexed but not in sitemap" list. It's typically stuff like PDF files, pagination pages like /page/4/, and so on. In my case I found a problem in one of my loops: you could type /page/99/ and get an empty page with nothing but a header, footer, and ads. It happened because I decided to show more posts per pagination page, which meant a lot of old page numbers that should no longer exist were still live instead of returning a 404. Google has about 50 of these indexed for me that I need to get removed now that they properly go to a 404 page.
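Once the loop is fixed, it's worth confirming that out-of-range pagination URLs really do return a 404 now instead of a live empty page. A rough check along these lines works; the URL pattern and page numbers below are made up, so adjust them to your own archive depth:

```python
import urllib.error
import urllib.request

PATTERN = "https://example.com/page/{}/"  # placeholder, match your own pagination URLs

# Probe page numbers well past the real end of the archive
for n in (50, 75, 99, 150):
    url = PATTERN.format(n)
    try:
        with urllib.request.urlopen(url) as r:
            status = r.getcode()
    except urllib.error.HTTPError as e:
        status = e.code
    # Anything past the last real page should be a 404; a 200 here means an
    # empty header/footer/ads page is still live and indexable.
    print(status, url)
```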

Then there's the Valid With Warnings panel:

[Screenshot: Valid with Warnings panel]


Look at that nightmare. I think I finally found out why my site hasn't been able to grow. I have over 500 affiliate links indexed with no titles, meta descriptions, or page content. They're all set to nofollow and noindex, but Google never saw that, because I blocked those URLs in robots.txt before I had it all set up properly and then never removed the block. So imagine: roughly 75% of my indexation is literal trash and pagination pages with nothing but ads. Makes sense that my sitewide quality is suffering no matter how good my 275-300 posts are.
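The underlying gotcha is that a noindex tag only works if the crawler is allowed to fetch the page and see it; a robots.txt block hides the tag, so the URL can sit in the index anyway. If you want to check which URLs your robots.txt is keeping Googlebot away from, Python's standard robotparser can do it. This is just a sketch, and the site plus the /go/ affiliate path are made-up examples:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder, use your own site
rp.read()

# URLs that carry noindex but are disallowed here will never have that tag
# seen by Googlebot, which is exactly how "Indexed, though blocked by
# robots.txt" warnings happen.
for url in (
    "https://example.com/go/some-affiliate-link/",  # made-up cloaked affiliate URL
    "https://example.com/page/99/",
):
    allowed = rp.can_fetch("Googlebot", url)
    print("crawlable" if allowed else "BLOCKED", url)
```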

There's also an Excluded panel. You don't need a screenshot, but it has stuff like:
  • Page with redirect
  • Blocked by robots.txt
  • Alternate page with proper canonical tag
  • Excluded by noindex tag
  • Not found (404)
  • Crawled - currently not indexed
  • Crawl anomaly
And more. Most of it is stuff that's fine to be excluded, but you should review it all and make sure you have no surprises lingering.

The new Coverage report is seriously the bee's knees. It might straight up revolutionize my site by helping me spot and fix indexation issues. It can help you do the same or at least confirm that you don't have any problems. Make sure to go check that out in the new Search Console.
 
It has been a pretty nice improvement.

I'm still waiting and hoping that they'll migrate the crawl rate report over too. Yeah, there are always server logs, but I confess I'm lazy and I've always liked being able to take a quick glance and confirm they're crawling smoothly. ;-)
 
It always amazes me how few people take advantage of Search Console. Bing's Webmaster Tools are actually pretty dope too, even if you give zero fucks about Bing's search engine.
 