Index Coverage Data Improvements
Monday, January 11, 2021
Helping people understand how Google crawls and indexes their sites has been one of the main objectives of Search Console since its early days. When we launched the new Search Console, we also introduced the Index Coverage report, which shows the indexing state of URLs that Google has visited, or tried to visit, in your property.
Based on the feedback we got from the community, today we are rolling out significant improvements to this report so you're better informed about issues that might prevent Google from crawling and indexing your pages. The change is focused on reporting a more accurate state for existing issues, which should help you solve them more easily. The list of changes includes:
- Removal of the generic "crawl anomaly" issue type - all crawl errors should now be mapped to an issue with a finer resolution.
- Pages that were submitted but blocked by robots.txt and got indexed are now reported as "indexed but blocked" (warning) instead of "submitted but blocked" (error).
- Addition of a new issue: "indexed without content" (warning).
- Soft 404 reporting is now more accurate.
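As background on the "indexed but blocked" state: a page disallowed in robots.txt cannot be crawled, yet Google may still index its URL if it is linked from elsewhere. As a minimal sketch (the `robots.txt` rules and URLs below are hypothetical examples, not from this post), you can check whether a given URL would be blocked for Googlebot using Python's standard library:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that disallows one section of the site.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# URLs under /private/ cannot be crawled; if such a URL is indexed
# anyway (e.g. via external links), the report now shows it as
# "indexed but blocked" (warning) rather than an error.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

This only tests robots.txt rules locally; the Index Coverage report itself reflects what Googlebot actually encountered on your site.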
The changes above are now reflected in the Index Coverage report, so you may see new types of issues or changes in issue counts. We hope that this change will help you better understand how we crawl and index your site.
Please share your feedback about the report through the Search Central Help Community or via Twitter.
Posted by Tal Yadid, Software Engineer, Search Console
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.