Site Structure: How Deep Is Too Deep?

Trying to figure out the optimal structure for a very large site has got my head in a spin. In this scenario, it isn't possible and wouldn't make sense to drop the Animals category.

Let's take the following, for example:

Home > Animals > Dogs > Breeds
Home > Animals > Dogs > Health
Home > Animals > Dogs > Training
Post URL structure is domain.com/post

Homepage: Links to all category pages as well as a handful of the most recent and/or most important posts from each major category
Animals: Links to all the subcategories (Dogs, Cats, Fish) and shows a handful of the most recent and/or most important posts from each
Dogs: Links to all subcategories (Breeds, Health, Training) as well as all the posts within the Dogs subcategory
Breeds: Links to all posts within the Breeds subcategory
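The hierarchy described above can be sketched as a simple tree. This is just a hypothetical illustration of the example structure (category names are from the post; nothing here is WordPress-specific):

```python
# Hypothetical sketch of the category tree from the example above.
CATEGORY_TREE = {
    "Animals": {
        "Dogs": ["Breeds", "Health", "Training"],
        "Cats": [],
        "Fish": [],
    },
}

def subcategories(parent):
    """Return the direct children of a top-level category."""
    return list(CATEGORY_TREE.get(parent, {}).keys())

print(subcategories("Animals"))  # ['Dogs', 'Cats', 'Fish']
```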

So far, so good, I think. But then there's the question of categorization, breadcrumbs and related posts.

First, for categorization, would it be okay to add a post to the Animals, Dogs and Breeds categories/subcategories? How would you then handle meta robots tags?
For breadcrumbs, I'm thinking it might be best to ignore Home > Animals > and limit it to just Dogs > Breeds.
For Related Posts, I think it would be best to display posts just from Breeds.

Would this work or would you run into any issues? Is there a better way to do things?

I've noticed a lot of large sites go down the Home > Animals > Dogs route and then just have a mishmash of posts about breeds, health, training. This is undoubtedly simpler but arguably isn't great for UX. Provided that you interlink well, I'm not sure whether it makes any difference for SEO.

An alternative could be to use a static, handpicked list of Related Posts, or at the very least use tags for Breeds, Health, Training for the Related Posts section. It would partly improve the UX at least.
 
My question would be "too deep for who?" Google? Users?

For Google, if you have a sitemap submitted to Webmaster Tools, nothing is too deep. If you have tons of pages indexed and interlinked, nothing is too deep. Every post is an entry point for Google.
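The sitemap point is why depth stops mattering: every post gets its own entry regardless of where it sits in the category tree. A minimal sketch of that idea, with made-up slugs and a made-up domain:

```python
# Hypothetical sketch: a flat sitemap lists every post directly, so a
# post buried three categories deep is just as discoverable as one at
# the top. Slugs and domain are invented for illustration.
posts = ["poodle-grooming", "labrador-guide", "crate-training"]

def sitemap_xml(slugs, base="https://example.com"):
    """Build a minimal sitemap with one <url> entry per post slug."""
    urls = "\n".join(
        f"  <url><loc>{base}/{slug}/</loc></url>" for slug in slugs
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )

print(sitemap_xml(posts))
```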

And the same goes for users. Most will enter your site at the exact page they're looking for, through Google. Then they'll move on from there (if they don't bounce), through internal links and related posts and all that.

First, for categorization, would it be okay to add a post to the Animals, Dogs and Breeds categories/subcategories? How would you then handle meta robots tags?

For categorization, and I'm assuming you're using WordPress, if you add a post to a sub-sub-category (Breeds), it will "roll" up through the sub-category above it (Dogs) and the parent category at the very top (Animals). There's no need to tick the checkbox for all of them. Assigning the deepest category puts the post in all of the above, as far as the WordPress Loop is concerned.
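That roll-up behavior can be modeled in a few lines. This is a hypothetical sketch, not WordPress code; the parent map uses the category names from the example:

```python
# Hypothetical model of the category "roll-up": a post assigned only to
# the deepest term still counts as belonging to every ancestor term.
PARENT = {
    "Breeds": "Dogs",
    "Health": "Dogs",
    "Training": "Dogs",
    "Dogs": "Animals",
    "Animals": None,  # top-level category, no parent
}

def effective_categories(assigned):
    """Expand a post's single assigned category to include all ancestors."""
    cats = []
    cat = assigned
    while cat is not None:
        cats.append(cat)
        cat = PARENT[cat]
    return cats

print(effective_categories("Breeds"))  # ['Breeds', 'Dogs', 'Animals']
```

So ticking only "Breeds" is enough; archive pages for Dogs and Animals pick the post up automatically.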

For breadcrumbs, I'm thinking it might be best to ignore Home > Animals > and limit it to just Dogs > Breeds.

I wouldn't mess with the breadcrumbs. The only thing I usually change is the final Post Title that displays; I make it say "Here" for brevity.
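For what a full, untrimmed trail looks like, here's a hypothetical sketch that walks the same parent map up from the post's category; names are from the example:

```python
# Hypothetical breadcrumb builder: walk from the post's category up to
# the root and render the full trail, rather than trimming it to
# "Dogs > Breeds". Category names are from the example.
PARENT = {
    "Breeds": "Dogs",
    "Health": "Dogs",
    "Training": "Dogs",
    "Dogs": "Animals",
    "Animals": None,
}

def breadcrumb(post_category, parent):
    """Render the full trail from Home down to the post's category."""
    trail = []
    cat = post_category
    while cat is not None:
        trail.insert(0, cat)
        cat = parent[cat]
    return " > ".join(["Home"] + trail)

print(breadcrumb("Breeds", PARENT))  # Home > Animals > Dogs > Breeds
```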

I've noticed a lot of large sites go down the Home > Animals > Dogs route and then just have a mishmash of posts about breeds, health, training. This is undoubtedly simpler but arguably isn't great for UX. Provided that you interlink well, whether it makes a difference with SEO, I'm not sure.

Breaking it up as deeply as you want is fine, because you're not even displaying it in the URL anyway with domain.com/post-slug/. If the site is going to be gigantic, and it sounds like it will be, you need to do whatever makes sense on the backend for you and your team in terms of organization, searching, etc. If you get that right, it'll be right for the users too.

An alternative could be to use a static, handpicked list of Related Posts, or at the very least use tags for Breeds, Health, Training for the Related Posts section. It would partly improve the UX at least.

I'm trying a static, hand-selected list of 5 on a newer site of mine, simply because they don't change all the time, so Google won't see different related posts on each crawl and won't constantly be reassigning PageRank flows and all that. I don't think it's important, but I did want to try it so that I'd have more static interlinking going on.
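The tag-based fallback mentioned above could look something like this. A hypothetical sketch with made-up slugs and tags; scoring by tag overlap and sorting deterministically keeps the list stable between crawls:

```python
# Hypothetical sketch of tag-overlap Related Posts. Slugs and tag sets
# are invented; the deterministic sort means the same related list is
# served on every crawl, like a static hand-picked list would be.
posts = {
    "poodle-grooming": {"Breeds", "Health"},
    "labrador-guide": {"Breeds"},
    "crate-training": {"Training"},
}

def related(slug, posts, limit=5):
    """Rank other posts by how many tags they share with this one."""
    tags = posts[slug]
    scored = [
        (len(tags & other_tags), other)
        for other, other_tags in posts.items()
        if other != slug
    ]
    scored = [(n, s) for n, s in scored if n > 0]  # drop zero-overlap posts
    scored.sort(key=lambda x: (-x[0], x[1]))       # stable, deterministic order
    return [s for _, s in scored[:limit]]

print(related("poodle-grooming", posts))  # ['labrador-guide']
```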
 
For Google, if you have a sitemap submitted to Webmaster Tools, nothing is too deep. If you have tons of pages indexed and interlinked, nothing is too deep. Every post is an entry point for Google.

Sums up how I feel about the "for Google" part.

I still love static html sites because of this. Google reads them super fast. No problem with bugs / needs for updates. And they rank really well.

And from my experience, the deep content structure has always been a positive(+)
 