Support » Plugin: Google XML Sitemaps » [Plugin: Google XML Sitemaps] URL blocked by robots.txt warning msg in Webmaster tools

Viewing 12 replies - 1 through 12 (of 12 total)
  • I am having the same problem.

    “Error URL restricted by robots.txt”

    Under Site Configuration → Crawler Access, this is what it says:

    User-agent: *
    Disallow: /
    
    Sitemap: http://cdrecords.net46.net/sitemap.xml.gz
    Plugin Contributor Arne Brachhold

    (@arnee)

    Yes, you are blocking all search engines from crawling your blog. Check the WordPress privacy settings:

    http://codex.wordpress.org/Settings_Privacy_Screen

    My privacy settings are set to “Allow search engines to index this site,” but I still have the same problem.

    Plugin Contributor Arne Brachhold

    (@arnee)

    Now your robots.txt looks fine. Wait up to 24 hours for Google to download it again.

    Oh, OK, Google updated its results. I had to create the robots.txt manually.

    Mine is now fine too! My privacy settings were correct, and the robots.txt file existed, so I’ve no idea what was happening, but it seems miraculously to have fixed itself!
    Thanks for the feedback.

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Sitemap: http://entreparentesis.com/sitemap.xml.gz

    This is what my robots.txt says. Is it blocking Google from crawling?

    Same problem here, but not resolved.

    No one has an answer to this?

    How is this listed as resolved when there is no answer posted? Some of us are still having issues.

    It seems to be some sort of bug. I had my website in development with the “Discourage search engines from indexing this site” option ticked. When I decided to get my site indexed by Google, surprise, surprise, it told me that robots.txt was blocking access to the site, even though I knew I had unchecked the “Discourage search engines from indexing this site” option and had the right robots.txt configuration. After that, I clicked “Save Changes” several times, reset the configuration for XML Sitemaps, and completely deleted robots.txt. It worked immediately afterwards.

    I’ve got the same problem with WP. It seems that WP doesn’t update all its settings correctly when the privacy setting is changed from blocked to public. I solved the problem by placing a real robots.txt file in the document root. This physical file takes priority over the virtual one WordPress generates. Since I did this, everything works fine.
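
    For reference, a static robots.txt along these lines (a sketch assuming the standard WordPress directory layout; the sitemap URL is a placeholder to be replaced with your own) allows crawlers in while keeping the admin areas blocked, matching what the plugin would normally generate:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Sitemap: http://example.com/sitemap.xml.gz

    Uploading this as a plain file named robots.txt in the site’s document root overrides the virtual robots.txt that WordPress serves, which is why it can fix the issue when the privacy setting appears stuck.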

  • The topic ‘[Plugin: Google XML Sitemaps] URL blocked by robots.txt warning msg in Webmaster tools’ is closed to new replies.