Support » Plugin: XML Sitemap Generator for Google » [Plugin: Google XML Sitemaps] Robots.txt blocking all of the pages in sitemap

Viewing 5 replies - 1 through 5 (of 5 total)
  • Thread Starter btzitney

    (@btzitney)

    My main issue is that Google Webmaster tools is saying all the pages on my site are being blocked by robots.txt, but, as you can see in my robots.txt file, none of them are blocked!

    http://www.thereisnoquit.com/robots.txt

    I have the same problem. Can somebody please help?

    Thread Starter btzitney

    (@btzitney)

    I partially fixed the problem. First, I made my own robots.txt file that didn’t block anything besides /wp-admin/ and /wp-includes/ (see the example below).
    Then I deactivated the plugin and deleted it.
    After that, I reinstalled it and made sure to uncheck the box on the Google XML Sitemaps configuration page that controls whether the plugin creates a virtual robots.txt file. I then submitted the sitemap through Webmaster Tools, and 6 of the 8 pages were indexed. I have no idea why the other two weren’t indexed, but the home page and most other pages are indexed now, at least.
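
    For reference, a robots.txt along these lines (the paths are assumed to match a default WordPress install) blocks only those two directories and nothing else:

        User-agent: *
        Disallow: /wp-admin/
        Disallow: /wp-includes/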

    Hope that helps.

    Thread Starter btzitney

    (@btzitney)

    I should note that I made sure to uncheck the virtual robots.txt option before I generated the map. Just wanted to make sure that was clear.

    The plugin doesn’t create the robots.txt file. That file is generated virtually by WordPress; the plugin just hooks into it to add the “Sitemap: …” line. It does not add any Allow or Disallow entries.
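
    As a rough sketch of what that hooking looks like (an illustration only, not the plugin’s actual code, and the sitemap URL is assumed), a plugin can append the line through WordPress’s standard robots_txt filter:

        // Illustration only: append a "Sitemap:" line to WordPress's virtual robots.txt.
        // The 'robots_txt' filter passes the generated output and the blog's public flag.
        add_filter( 'robots_txt', function ( $output, $public ) {
            // Assumed sitemap location; the real plugin builds its own URL.
            $output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
            return $output;
        }, 10, 2 );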

    If you don’t like the robots.txt generated by WordPress, just upload your own (even an empty one).

  • The topic ‘[Plugin: Google XML Sitemaps] Robots.txt blocking all of the pages in sitemap’ is closed to new replies.