Support » Plugin: Google XML Sitemaps » [Plugin: Google XML Sitemaps] Robots.txt File blocking Google from Crawling

  • Ever since updating the plugin, Google has not been able to crawl my site; URLs are blocked. All of my post pages seem to be blocked. Post pages that once appeared in Google Search are now replaced with the home page (Google is able to serve the home page due to pagination – picking up some of the words from within the post because of the link to the home page).

    I would really appreciate some assistance because I do not know how to begin to fix this. I have noticed that others have had the same issue when searching in Google forums, and the problem seems to stem from this section of the plugin settings:

    Add sitemap URL to the virtual robots.txt file.
    The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!
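
    For reference, the virtual robots.txt WordPress serves with this plugin setting enabled typically looks something like the sketch below (the domain and sitemap filename are placeholders, not the poster's actual values):

    ```
    User-agent: *
    Disallow:

    Sitemap: http://example.com/sitemap.xml
    ```

    Note that if the WordPress privacy setting is set to block search engines, the virtual file instead contains `Disallow: /`, which blocks crawling of the entire site – which is why the moderator asks about the privacy setting below.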

    I’m not sure if the backend settings need to be adjusted or what needs to be done to correct this issue, but I am concerned that if I don’t get it fixed sooner rather than later I may lose my ranking. Others have stated that once they fixed this issue, everything returned to normal within a couple of weeks.

    Thank you!

Viewing 6 replies - 1 through 6 (of 6 total)
  • Moderator cubecolour



    what is the URL of your site?

    what setting do you have set for privacy?

    Allow search engines to index site.

    Also, not sure if this applies, but Google has been denied access to this URL –

    My hosting company went in and made some changes today, but I’m not certain it worked because Google still doesn’t seem to be able to crawl (I think). I would like to make sure everything is as it should be. Everything went crazy after the update. Others have stated they placed the above URL in the robots.txt and it seemed to kick everything back into shape, but I really don’t understand all of this, and I would like to know the proper fix, or what settings should be in place, so that everything works properly.

    Thank you so much for your assistance.

    P.S. I was not able to manually fetch as Googlebot (error: failed). After the hosting company went in and made some changes, I was finally able to submit manually, but the error “access denied” continues to appear in Webmaster Tools.

    Just a little something I ran into myself once which may be of service. I had initially set my WP install to private during the build and toggled it off when I was ready to launch and submit to search engines but, like many of you, had warnings/errors indicating a robots.txt file was blocking access. Of course, there was no robots.txt file in my root folder – WP was dynamically “generating” one. It turned out the culprit was a security plugin I was using (BulletProof – although I imagine WP Security Scan and others might produce similar issues) which, when set to full security mode, was adding some stringent deny code to my .htaccess file. I simply toggled off root .htaccess security (kept admin areas locked down) and, within minutes, bots were happily chomping away!
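
    To illustrate the kind of thing to look for (the exact rules a security plugin writes vary; this is a hypothetical example, not BulletProof’s actual code), overly strict .htaccess rules can deny crawlers outright, e.g. by matching the user agent:

    ```apache
    # Hypothetical example of restrictive rules a security plugin
    # might add to the site root .htaccess – returns 403 Forbidden
    # to any request whose user agent matches "bot":
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} bot [NC]
    RewriteRule .* - [F,L]
    ```

    If rules like these appear in your root .htaccess, Googlebot will see “access denied” exactly as described in Webmaster Tools, regardless of what robots.txt says.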

    Thought I’d offer this up as one more thing you could check! Cheers!

    Thank you so much, zeke43. I don’t have a security plugin, but I have had issues with plugins that make changes to the .htaccess file. I’m hoping cubecolour will come back with some info – there must be a reason he asked the preliminary questions.

    Although, as you’ve said, a security plugin probably isn’t the cause of the problem for you – just as an FYI to others experiencing this problem: following zeke’s advice, I disabled WP Security Scan and this allowed Google to access my sitemap again.

  • The topic ‘[Plugin: Google XML Sitemaps] Robots.txt File blocking Google from Crawling’ is closed to new replies.