After setting up different websites and struggling to get them indexed by Google, I have learnt a thing or two about the errors Google Search Console throws at you when it comes to indexing. So today I am sharing the most common indexing issues in Google Search Console and how to solve them.
Google Search Console helps website owners monitor and troubleshoot various indexing issues that can affect the visibility of their pages in Google search results.
If you are struggling with getting your site indexed by Google and want to make the most of Google Search Console, check out these articles:
Common Indexing issues in Google Search Console and how to solve them.
How long does it take for Google Search Console to index?
How To Get A Page Indexed Faster With The Inspection Tool In Google Search Console
How to use Google Search Console for SEO: The Ultimate Checklist.
How to use Google Search Console for Keyword Research
Google Analytics versus Google Search Console for SEO: How to Maximise Your Rankings
Here are some common indexing issues and their potential solutions:
The 8 reasons why your site is not indexed by Google.
1. Coverage Issues:
Error: “Submitted URL not found (404)”:
This error indicates that the submitted URL returns a 404 (page not found). To resolve this, check the URL for typos, make sure the page actually exists at that address, and fix any broken links pointing to non-existent pages.
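If you have more than a handful of URLs to check, requesting them in bulk and reading the status codes is faster than clicking through them one by one. Here is a minimal sketch in Python, assuming the requests package is installed (pip install requests) and using example.com as a placeholder domain:

```python
import requests

# Placeholder URLs; replace example.com with your own pages.
urls = [
    "https://example.com/",
    "https://example.com/old-post/",
]

for url in urls:
    # HEAD fetches only the status line and headers, which is enough
    # to spot 404s without downloading every page.
    response = requests.head(url, allow_redirects=True, timeout=10)
    print(response.status_code, url)
```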
Error: “Submitted URL seems to be a soft 404”:
Google perceives the submitted URL as a soft 404 error, which means it may not provide useful content. Review the page content and make sure it provides relevant information to users. If the page should be indexed, consider updating its content to match user expectations.
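Google does not tell you exactly why it considers a page a soft 404, but a rough heuristic of your own can flag likely candidates before Google does. A sketch, again assuming the requests package and a placeholder URL; treat the thresholds as guesses to tune for your site:

```python
import requests

def looks_like_soft_404(url: str) -> bool:
    """Rough heuristic: a 200 response whose body reads like an error page."""
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    body = response.text.lower()
    error_phrases = ("page not found", "not be found", "no longer available")
    # Error wording or a near-empty body on a "successful" page is suspicious.
    return any(phrase in body for phrase in error_phrases) or len(body) < 512

print(looks_like_soft_404("https://example.com/maybe-gone/"))  # placeholder URL
```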
Error: “Indexed, though blocked by robots.txt”:
This warning means the page was indexed even though your robots.txt file blocks Googlebot from crawling it, so Google cannot actually read its content. Verify the file to ensure the page is not unintentionally blocked, and adjust the robots.txt rules if necessary to allow crawling and indexing of important pages.
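You can test your robots.txt rules locally before touching the live file. Python’s standard library includes a parser for exactly this; a quick sketch with placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")  # placeholder domain
parser.read()

# "Googlebot" is the user agent that matters for Google indexing.
for url in ("https://example.com/", "https://example.com/blog/my-post/"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)
```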
Error: “Indexed, not submitted in sitemap”:
The page is indexed, but it is not included in the submitted sitemap. Check the sitemap configuration and ensure the page is properly included. Consider regenerating the sitemap if needed.
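To double-check whether a page really is listed, you can parse the sitemap and search for the URL. A sketch that assumes a single standard sitemap.xml (not a sitemap index) and the requests package:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
PAGE_URL = "https://example.com/blog/my-post/"   # page to look for

namespace = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
listed = {loc.text.strip() for loc in root.findall(".//sm:loc", namespace)}

print("in sitemap" if PAGE_URL in listed else "NOT in sitemap")
```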
2. Mobile Usability Issues:
With the emphasis on mobile-first indexing, mobile usability issues can impact how your site is indexed and displayed on mobile devices. To solve this issue, you should:
Use the Mobile Usability report in Google Search Console to identify specific issues affecting your mobile site.
Fix mobile usability issues, such as mobile-unfriendly design, unplayable videos, or faulty redirects.
Error: “Viewport not configured”: The page does not have a properly configured viewport meta tag. Ensure the viewport is correctly set to accommodate various devices and screen sizes (see the sketch after this list).
Error: “Text too small to read”: The text on the page is too small to be legible on mobile devices. Adjust the font size and optimise the page layout for mobile viewing.
Error: “Clickable elements too close together”: Interactive elements (buttons, links) are positioned too closely, making them difficult to tap accurately on mobile screens. Increase the spacing between clickable elements to improve user experience.
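For the viewport error above, the fix is usually a single tag in the page’s <head>: <meta name="viewport" content="width=device-width, initial-scale=1">. To check whether a page already declares one, here is a minimal sketch in Python (placeholder URL, assuming the requests package is installed):

```python
import re
import requests

url = "https://example.com/"  # placeholder
html = requests.get(url, timeout=10).text

# A responsive page should declare a viewport meta tag in its <head>.
match = re.search(r'<meta[^>]*name=["\']viewport["\'][^>]*>', html, re.IGNORECASE)
print(match.group(0) if match else "No viewport meta tag found")
```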
3. Structured Data Issues:
Error: “Either ‘offers’, ‘review’, or ‘aggregateRating’ should be specified”: This error occurs when required structured data properties are missing. Add the necessary properties according to the structured data type, such as price for product pages or rating for review pages.
Error: “Missing field ‘name’”: The ‘name’ field is missing in the structured data. Include the ‘name’ field with the appropriate value to accurately describe the entity.
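To make this concrete, here is what minimal Product markup could look like with both the ‘name’ field and an ‘offers’ property filled in. The values are placeholders and the JSON-LD is simply generated from Python for illustration; always validate real markup with Google’s Rich Results Test:

```python
import json

# Placeholder Product data; the "offers" property satisfies the
# "either 'offers', 'review', or 'aggregateRating'" requirement,
# and "name" is the field GSC flags when it is missing.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

# Paste the output into a <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))
```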
4. Crawl Errors:
These errors occur when Googlebot encounters problems accessing and crawling your webpages. They can be caused by server errors, broken links, or pages blocked by robots.txt.
There is one specific issue, very common when setting up a completely new website, that I covered in full in the article How to fix the “Crawled — Currently Not Indexed” and “Discovered – Currently not indexed” issue in Google Search Console. Check it out if that is your case.
To solve this issue, you should:
- Check for server errors and ensure your website is accessible.
- Fix broken links and ensure all internal links are working correctly.
- Review your robots.txt file to ensure it’s not blocking important pages.
Server Errors (5xx):
These errors occur when Googlebot encounters a server-related problem while trying to crawl your webpages. Common server errors include the 500 Internal Server Error and 503 Service Unavailable. To solve this issue, you should:
- Check your server logs for any technical issues or errors.
- Ensure your server is properly configured and can handle the crawl load from search engines.
- If the server error is temporary, wait for the server to be back online and accessible to Googlebot.
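To tell a persistent server problem from a temporary blip, poll the affected URL a few times with a growing delay. A small sketch with a placeholder URL, assuming the requests package:

```python
import time
import requests

url = "https://example.com/"  # placeholder

# Repeated 5xx responses point to a real server-side problem;
# a one-off 503 may just be temporary load or maintenance.
for attempt in range(3):
    status = requests.get(url, timeout=10).status_code
    print(f"attempt {attempt + 1}: HTTP {status}")
    if status < 500:
        break
    time.sleep(2 ** attempt)  # simple exponential backoff
```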
Soft 404 Errors:
Soft 404 errors happen when a page tells the user it cannot be found (typically a generic error page) but returns a 200 (OK) status code instead of a real 404. Google then sees a “successful” response with no useful content. To solve this issue, you should:
- Properly configure your server to return the correct 404 status code when a page is not found.
- Customise your error pages to provide helpful information to users and guide them to relevant content.
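Most CMSs return a proper 404 status out of the box, but if you run a custom application you have to wire it up yourself. Purely as an illustration, here is the idea sketched with Flask (a hypothetical setup, assuming Flask is installed):

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Return a helpful page AND the real 404 status code, so crawlers
    # never mistake the error page for normal content (a soft 404).
    return "<h1>Page not found</h1><p>Try the <a href='/'>homepage</a>.</p>", 404

if __name__ == "__main__":
    app.run()
```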
Not Found (404):
This error indicates that the submitted URL was not found, meaning the page does not exist on your website. To solve this issue, you should:
- Review your website’s internal linking structure and update or fix any broken links pointing to non-existent pages.
- If the page has been permanently removed, consider setting up a 301 redirect to redirect users and search engines to a relevant page.
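How you create a 301 depends on your server or CMS. As an illustration only, here is the idea in a Flask app with hypothetical routes:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Permanently redirect a removed page to its closest replacement,
# passing users and link signals on to the new URL.
@app.route("/old-page/")
def old_page():
    return redirect("/new-page/", code=301)

if __name__ == "__main__":
    app.run()
```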
Access Denied (4xx):
These errors occur when Googlebot is denied access to crawl a specific page due to permission restrictions. The most common error is the 403 Forbidden status code. To solve this issue, you should:
- Check your website’s security settings, including file permissions and access rules.
- Ensure that pages you want to be indexed are not unintentionally blocked by a robots.txt file or other access restrictions.
- Review and update your website’s permissions to allow search engine crawlers access to relevant pages.
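A quick way to see whether a security rule singles out crawlers is to request the same page with a browser-like and a Googlebot-like user agent. A sketch with a placeholder URL; note that some firewalls also verify the requesting IP, so treat the result only as a hint:

```python
import requests

url = "https://example.com/"  # placeholder

# A 403 only for the crawler-like user agent suggests a firewall or
# security rule is blocking bots rather than the page being broken.
user_agents = (
    "Mozilla/5.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
)
for ua in user_agents:
    status = requests.get(url, headers={"User-Agent": ua}, timeout=10).status_code
    print(f"HTTP {status} with User-Agent: {ua}")
```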
5. Sitemap Issues:
If your sitemap is not properly configured or submitted to Google Search Console, it can lead to indexing issues. To solve this issue, you should:
Generate a sitemap for your website and ensure it includes all relevant pages.
Submit your sitemap to Google Search Console to help Google discover and index your pages.
Regularly update and resubmit your sitemap whenever you make significant changes to your site.
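Most CMSs and SEO plugins generate the sitemap for you, but for a hand-rolled site a basic one is easy to produce. A sketch with placeholder URLs, using only Python’s standard library:

```python
import xml.etree.ElementTree as ET

pages = [  # replace with your site's real URLs
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/blog/my-post/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Writes a minimal, valid sitemap.xml to the current directory.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```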
6. URL Parameters:
Submitted URL Blocked by robots.txt: This error indicates that the URL you submitted to Google for indexing is blocked by your robots.txt file. To solve this issue, you should:
Review your robots.txt file and verify if the URL is intentionally blocked.
If the URL should be indexed, update your robots.txt file to allow access to that specific URL or remove the blocking directive altogether.
Submitted URL Marked ‘noindex’: This error occurs when the page you submitted for indexing contains a ‘noindex’ directive in its HTML code or HTTP headers. To solve this issue, you should:
- Check your website’s code or CMS settings to ensure that pages you want to be indexed do not have the ‘noindex’ directive.
- Remove the ‘noindex’ directive from relevant pages, allowing search engines to index them.
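A ‘noindex’ can hide in two places: the HTTP response headers or the page’s HTML. This sketch checks both for a placeholder URL, assuming the requests package:

```python
import re
import requests

url = "https://example.com/blog/my-post/"  # placeholder
response = requests.get(url, timeout=10)

# 'noindex' can arrive via the X-Robots-Tag header or a meta robots tag.
header = response.headers.get("X-Robots-Tag", "")
meta = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*>', response.text, re.IGNORECASE)

print("X-Robots-Tag header:", header or "(none)")
print("meta robots tag:", meta.group(0) if meta else "(none)")
```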
In addition to these, URL parameters (for example ?sort=price or ?ref=newsletter) can create duplicate content issues and confuse search engines. To solve this issue, you should:
- Set canonical tags to consolidate duplicate content and indicate the preferred version of a page.
- Link internally to the clean, parameter-free version of each URL and keep parameterised URLs out of your sitemap. (Google has retired the old URL Parameters tool from Search Console, so canonical tags and consistent linking are now the way to handle this.)
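To confirm that parameterised versions of a page all point at one preferred URL, you can extract the canonical tag from each variant. A sketch with hypothetical parameterised URLs:

```python
import re
import requests

# Two parameterised variants of the same page; both should declare
# the same canonical URL in their <head>.
urls = [
    "https://example.com/shoes/?sort=price",
    "https://example.com/shoes/?colour=red",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    tag = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
    print(url, "->", tag.group(0) if tag else "no canonical tag")
```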
7. Blocked Resources:
If certain resources on your website, such as CSS or JavaScript files, are blocked from being crawled by search engines, it can hinder proper indexing. To solve this issue, you should:
Review your robots.txt file to ensure important resources are not blocked.
Use the URL Inspection tool in Google Search Console (the successor to the old Fetch as Google feature) to check if resources are accessible to Googlebot.
Optimise your website’s rendering and ensure important content is visible without the need for JavaScript.
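The same robots.txt check from earlier works for assets too: point it at your CSS and JavaScript files to make sure Googlebot can fetch them. Placeholder paths below:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")  # placeholder domain
parser.read()

# Googlebot needs these assets to render the page the way users see it.
assets = [
    "https://example.com/assets/styles.css",
    "https://example.com/assets/app.js",
]
for asset in assets:
    verdict = "allowed" if parser.can_fetch("Googlebot", asset) else "BLOCKED"
    print(verdict, asset)
```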
8. Duplicate Content:
Having identical or very similar content across multiple pages can confuse search engines and dilute the visibility of your website. To solve this issue, you should:
Identify and consolidate duplicate content by using canonical tags to point to the preferred version of a page.
Improve your site’s information architecture and internal linking structure to reduce duplicate content issues.
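As a rough first pass at spotting duplicates, you can compare the raw HTML of two suspect pages. Shared templates inflate the score, so treat it only as a hint; the URLs are placeholders:

```python
import difflib
import requests

page_a = requests.get("https://example.com/page-a/", timeout=10).text
page_b = requests.get("https://example.com/page-b/", timeout=10).text

# A very high similarity ratio hints that two pages compete with each
# other and should be consolidated under one canonical URL.
ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
print(f"similarity: {ratio:.0%}")
```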
It’s important to regularly monitor Google Search Console for any indexing issues and promptly address them to ensure optimal visibility and performance in Google search results. Remember to consult official Google documentation and guidelines for detailed instructions on troubleshooting and resolving specific issues.
For me, it is absolutely essential to regularly monitor the Index Coverage report in Google Search Console to identify and address any indexing errors promptly. You know, when you get those GSC email notifications, don’t ignore them; log in and see if there is anything that requires your attention.
By resolving these errors, you can ensure that your website is effectively crawled and indexed by Google, leading to better visibility in search results.
Here is a video from the Google Search Central team that explains in more detail how indexing fits into the overall search process:
I hope this was useful. See you soon!
Missing me already, dear human? You can find me on X and Facebook.
Moxie