(SEO GOOGLE) NextJS | Pages not indexed for over 6 months despite continuous fixes

Hello everyone,

We are experiencing a persistent indexing issue with our website and would greatly appreciate any guidance from the community.
For over 6 months, Google has indexed only our two localized homepages (/it, /en), while all other pages remain excluded. During this time, we have continuously implemented fixes and improvements based on Search Console feedback and best practices, but none of these changes has resulted in successful validation or indexing.

Some of our pages currently reach performance scores as high as 97%, and we have carefully reviewed:
  • Technical SEO (status codes, canonical tags, sitemap, robots.txt)
  • Core Web Vitals and performance metrics
  • Structured data (where applicable)
  • Internal linking and crawlability
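As a sanity check on the crawlability item above, the robots.txt rules can be verified programmatically. This is a minimal sketch using only the Python standard library; the rules and URLs are hypothetical placeholders, not the poster's actual site.

```python
# Sketch: verify that sample URLs are crawlable under a robots.txt ruleset.
# The robots.txt content and example.com URLs below are assumptions for
# illustration only -- substitute your real file and page paths.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ("/en", "/it", "/en/pricing", "/admin/login"):
    url = "https://example.com" + path
    status = "crawlable" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(path, status)
```

Running this against the live robots.txt (via `parser.set_url(...)` and `parser.read()`) for a handful of excluded pages quickly rules out an accidental Disallow rule.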

Despite these efforts, the pages are still not being indexed, and Search Console does not provide actionable errors that explain the situation clearly.
We would love to hear from anyone who has faced similar issues:
  • What are the most common causes for pages remaining unindexed for such a long period, even when no critical issues are reported?
  • Are there specific signals or requirements that might prevent Google from indexing these pages?
  • Are there additional checks, tools, or strategies you recommend to diagnose or resolve this kind of issue?

We are actively maintaining and improving the site, but without clearer feedback it is difficult to determine the next steps.

I'll send you all the necessary links privately so you can help me.

Thank you in advance for any advice or insights you can share.

Best regards
 
How old is your domain, and how old is your website? For example, a domain registered in 2010 that sat inactive until launching as a brand-new site 8 months ago is effectively an 8-month-old site, not a 16-year-old one.

Let's start with the foundation first.
 
Also, have you double- and triple-checked that you're not emitting noindex directives from .htaccess (or its equivalent, e.g. an X-Robots-Tag response header) or in the HTML head? And have you checked robots.txt for rules that block crawling? It seems you've been diligent, but it's worth another look.
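One way to run that noindex check across many pages is to scan each page's rendered HTML for a robots meta tag. This is a standard-library sketch that checks a hard-coded sample document; in practice you would fetch each URL (and also inspect the X-Robots-Tag response header, which this snippet does not cover).

```python
# Sketch: detect a stray <meta name="robots" content="...noindex..."> tag
# in a page's HTML. The sample document below is an assumption for
# illustration -- feed in the HTML you actually fetch from each URL.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Sets self.noindex if a robots meta tag containing 'noindex' appears."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
finder = RobotsMetaFinder()
finder.feed(sample)
print("noindex found" if finder.noindex else "page is indexable")
```

With a Next.js site it is worth running this against the server-rendered output rather than the source, since a framework-level metadata setting can inject the tag at render time.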

I'd also check Google Search Console's Removals tool for anything stray (or sabotage): a removal request may have been filed that drops entire sub-folders out of the viewable index.

Have you checked in Search Console to see whether Google is even crawling the pages? If it is, you should be able to see Googlebot hitting them and downloading resources in the crawl stats.
 