• While logged into Google Search Console (GSC), I noticed my site's coverage report showed a major dip in valid pages, so I went to investigate my sitemap settings. How can I tell if my sitemap is active?

    Under SEO Framework Settings > Sitemap Settings > Robots.txt, I noticed it says this:

    Add sitemap location to robots.txt?
    This only works when the sitemap is active.

    Something in the General sitemap settings also made me question whether it was active. See below, where it says "View the base sitemap" rather than "The sitemap can be found here" (with a link):
    Output sitemap?
    View the base sitemap.
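    For reference, when that robots.txt option is enabled, the plugin adds a sitemap directive along these lines (the domain here is a placeholder, not my actual site):

```text
# Example line appended to robots.txt when the setting is enabled:
Sitemap: https://example.com/sitemap.xml
```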

    I look forward to hearing from you!

Viewing 5 replies - 1 through 5 (of 5 total)
  • Did you try to open your sitemap? And submit the new link?

    There is a link to it in the settings, but it's /sitemap.xml. You can open it and see if all the pages are there.
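    If you'd rather check the sitemap programmatically, here's a rough sketch in Python that parses a sitemap document and lists its <loc> URLs. In practice you would fetch your own /sitemap.xml; the inline sample and example.com URLs below are placeholders for illustration:

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, per the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

# A tiny sample sitemap; replace with the body of your real /sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1/</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls), "URLs found")  # -> 2 URLs found
```

    Comparing that URL count against your actual post count is a quick way to spot pages missing from the sitemap.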

    Plugin Author Sybre Waaijer


    Hello 🙂

    Thanks for filling in, LaMpiR! Much appreciated!

    TSF prevents indexing of many redundant pages, such as feed links, as well as duplicated-content pages. With the recent addition of our “Advanced Query Protection”, accidentally generated faux-archive pages are blocked out, too. This can explain the major dip in “valid” pages.

    I removed the “The sitemap can be found here” sentence in TSF v4.0. Since then, we started calling it the “Base Sitemap”, because we have more sitemap types on our roadmap. One of these, the Google News sitemap via our Articles extension, has already been released.

    If that “Base sitemap” link leads to any valid sitemap, then it’s working as intended. You can add it to GSC, via this page: https://www.google.com/webmasters/tools/sitemap-list (select your property at the top left).

    I hope this explains the lot! Cheers 🙂

    Thread Starter downtowncasa7


    Thanks to both of you! Yes, that does help. But my concern is that those pages are not on the sitemap. I have a total of 27 blog posts (none of them duplicates), and my sitemap coverage dropped from 39 to 15.

    I resubmitted the sitemap and now it’s at 18. I guess I’m just concerned that the rankings of those pages now labeled as not valid will drop. But perhaps you can enlighten me?

    Thread Starter downtowncasa7


    I should correct that statement, actually. They are all on the sitemap, so why would the coverage drop, then?

    Plugin Author Sybre Waaijer



    If you go to “Google Search Console > Sitemaps (left) > Hit the /sitemap.xml row (middle) > See Index Coverage > Excluded”, you can see the Details at the bottom, like in this image:

    [Image: Google Search Console Sitemap Details]

    In the Details, Google explains why the URLs are excluded.

    Now, Google uses your “intent” to see if a page should be indexed; that way, they combat duplicated content autonomously. This “intent” is derived from backlinks or internal links. So, when the details say “Crawled/Discovered – Currently not indexed”, the URLs are likely new and Google hasn’t seen backlinks or internal links to those pages yet.

    There might be other reasons listed in those details, which you should be able to resolve by adjusting TSF’s settings.

    Lastly, and this is a common oversight, if you click on any details row, you can see which pages are affected.

    I hope this helps as well! Cheers 🙂

  • The topic ‘Sitemap not Active’ is closed to new replies.