Support » Plugin: Google XML Sitemaps » Sitemap URLs blocked by robots.txt file

  • mpoepping30


    I manage a new site for my company that is showing 4 of these index errors in Google Search Console, covering every part of its sitemap (the listed link, plus 3 others containing sitemap details), and I can’t figure out how to resolve them.

    We are using the Google XML Sitemap Plugin and have the setting “Add sitemap URL to the virtual robots.txt file” selected in its settings.
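    For context on what that setting does: WordPress only serves its dynamically generated (“virtual”) robots.txt when no physical robots.txt file exists in the site root, and plugins append lines to it through the core `robots_txt` filter. The sketch below illustrates that mechanism in general; it is not the plugin’s actual code, and the sitemap path is an assumption.

    ```
    <?php
    // Illustrative sketch only, not the plugin's real implementation.
    // WordPress builds the virtual robots.txt in do_robots() and passes
    // it through the 'robots_txt' filter; a plugin can append a line:
    add_filter( 'robots_txt', function ( $output, $public ) {
        // $public reflects the "blog_public" option; only advertise the
        // sitemap when the site is visible to search engines.
        if ( '0' !== (string) $public ) {
            $output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
        }
        return $output;
    }, 10, 2 );
    ```

    This is also why deleting a physical robots.txt file doesn’t necessarily change what Google sees: with no physical file, the virtual one takes over.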

    I’m wondering if this setting is causing these index errors to occur in our sitemaps, or if it’s something else.

    I tried changing our robots.txt file to allow everything from all bots, but that didn’t resolve the issue: I made the change this morning and resubmitted the sitemap through Search Console, yet the 4 sitemap index errors remain.

    I even tried removing the robots.txt file from the site’s root directory entirely, and the 4 errors still remain.

    Do you think this setting is causing the issue, is it the virtual robots.txt causing problems (and if so, how do I find and change it), or is it something else entirely?

    I’d like to get this fixed ASAP to avoid any further penalties from Google, so any guidance would be greatly appreciated. Thanks!


Viewing 3 replies - 1 through 3 (of 3 total)
  • mpoepping30, have you found a solution yet? I see there has been no response here. I’ve sought support for this same problem elsewhere, and no one seems able to assist.

    @ecokleensolar, I never found an actual solution anywhere, and I ran into the same trouble you did finding anyone who could answer it.

    However, after thoroughly combing through my client’s Search Console reports and an Ahrefs site audit, I did end up fixing every error I could. After that, the sitemap issues cleared once I resubmitted the sitemap to Search Console.

    I don’t necessarily think that was a direct fix. My best guess is that the virtual robots.txt finally updated to reflect my robots.txt changes (see below for what I landed on), or that removing a schema markup plugin I had installed (which I also did while trying to fix this) may have helped, but I’ll never know for sure.

    I think that plugin (I can’t remember the name of the one we were using) was generating its own sitemap, and Yoast may have been as well; two plugins generating sitemaps could also have caused the bot crawl confusion.

    Sorry I can’t be more help, but hopefully this gives you some ideas on what to try/where to look!

    Sample Robots.txt file:

    User-Agent: *
    Allow: /
    Allow: /wp-content/uploads/
    Disallow: /wp-content/plugins/
    Disallow: /readme.html
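
    One more option worth noting: a robots.txt file can also declare the sitemap location explicitly with a Sitemap directive, which is essentially what the plugin’s “Add sitemap URL to the virtual robots.txt file” setting writes into the virtual file. The URL below is a placeholder, not this site’s actual sitemap location:

    Sitemap: https://example.com/sitemap.xml

    If you use a physical robots.txt, adding that line yourself keeps the sitemap advertised even with the plugin setting turned off.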

    @mpoepping30, thanks for your detailed reply. I didn’t have a plugin causing the issue, so what I did instead was use “fetch and render” plus “request indexing” for all pages through Google Search Console. It’s a slow approach, since every page has to be done individually (and Google restricts you to only 10 pages per session), but within minutes of requesting indexing, the issue was sorted for each page.
