WordPress.org

Forums

WordPress SEO by Yoast
[resolved] Sitemap Blocked By Robots.txt (18 posts)

  1. cadolphin
    Member
    Posted 2 years ago #

    WordPress version: 3.3.2
    WordPress SEO version: 1.1.9

    I did this: Activated the XML sitemap and made sure the privacy setting in WP is set to "Allow"

    I expected the plugin to do this: Create a sitemap and ping the search engines

    Instead it did this: It created the sitemap and pinged the search engines, but I got the following message from Google: "Sitemap contains URLs which are blocked by robots.txt."

    How do I resolve this issue?

    http://wordpress.org/extend/plugins/wordpress-seo/

  2. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    Unblock the URLs that are blocked by robots.txt? Or remove those URLs from your XML sitemap using the advanced tab of those pages.
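Before unblocking anything, it can help to confirm which URLs the robots.txt rules actually block. A minimal sketch using Python's standard `urllib.robotparser` — the rules below are hypothetical, modeled on a default WordPress robots.txt; substitute the contents of your own /robots.txt and the URLs Google flagged:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, modeled on a default WordPress robots.txt;
# paste in your own /robots.txt contents instead.
rules = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The homepage should be crawlable; the admin area should not be.
print(parser.can_fetch("Googlebot", "http://example.com/"))           # True
print(parser.can_fetch("Googlebot", "http://example.com/wp-admin/"))  # False
```

If a URL from the sitemap comes back `False` here, that rule is what Google is complaining about.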

  3. cadolphin
    Member
    Posted 2 years ago #

    Sorry, I'm a newbie to all of this. The URL that is blocked is my site's homepage. I don't know where to even begin to unblock the URL. I've gone through every plug-in to see if any others have privacy settings I might have missed. I tried looking through the root directory and can't even begin to figure that out...

  4. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    If you don't give me a URL to your site I can't help you...

  5. cadolphin
    Member
    Posted 2 years ago #

  6. cadolphin
    Member
    Posted 2 years ago #

  7. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    That sitemap is not generated by my plugin and, in fact, doesn't exist anymore. So that's why GWT might be throwing errors...

  8. cadolphin
    Member
    Posted 2 years ago #

    I'm now totally confused. I originally had a different plugin (Google XML Sitemap) installed and used it to create the first sitemap before I pinged the search engines. When I got the message from Google, I looked at all the privacy settings and found your plugin had also created a sitemap. I uninstalled the original plugin and used your plugin to create a new sitemap. I then re-pinged Google.

    If the sitemap listed isn't from your plugin, how was it generated? Is it somehow from the first plugin, which I installed a few months ago?

    Do you have any suggestions on how I can fix this?

    Thank you for your patience in helping me through this.

  9. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    You can just delete that file; that plugin generated a static XML sitemap, whereas mine isn't a static file.

  10. cadolphin
    Member
    Posted 2 years ago #

    Thanks, Joost. I'm sorry if I seem slow but this is all so new to me. How do I go about doing that? Where can I find the file to delete it? My assumption is that once I delete that old file, I re-generate a new sitemap using your plug-in. Is that correct?

    Again, thank you.

  11. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    Log in to your server with FTP and delete the sitemap.xml in the root of your install; usually that's the public_html directory.

    You don't have to regenerate a file using my plugin; it's already there, as you can see. You might have to add it to Google Webmaster Tools, though.
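The deletion step can be simulated locally before touching a real server. A sketch in Python — the public_html location is Joost's "usually" and your install root may differ, so this demo uses a temp directory as a stand-in:

```python
import os
import tempfile

# Stand-in for the install root (usually public_html; verify yours).
docroot = tempfile.mkdtemp()

# The stale, static sitemap.xml left behind by the old plugin.
stale = os.path.join(docroot, "sitemap.xml")
open(stale, "w").close()

# Deleting it does not affect Yoast's sitemap, which is generated
# dynamically on request rather than stored as a file on disk.
os.remove(stale)
print(os.path.exists(stale))  # False
```

The point of the exercise: only the old plugin's sitemap exists as a file you can delete; the new one has no file to remove or regenerate.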

  12. cadolphin
    Member
    Posted 2 years ago #

    Thanks, Joost. I really appreciate you helping me through this.

  13. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    My pleasure :)

  14. pates
    Member
    Posted 2 years ago #

    I'm receiving the same error in WMT. It's even saying that the warning is for this value (as well as others):

    Sitemap: /page-sitemap.xml
    Value: http://www.domain.co.uk/

    Slightly worrying, but the /robots.txt is only blocking the standard /wp-admin and /wp-includes folders. I'm a little confused here, as the sitemaps that have been submitted are also 100% clean.

    I appreciate I haven't given much clarity on this.

  15. splimwad
    Member
    Posted 2 years ago #

    Hi,
    I am also a little confused. My site is
    http://stopsmokinghypnosis1.com

    My robots file looks like this:
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    It is also saying that Google tried to crawl but was blocked by the robots.txt file.

    I submitted my sitemap_index.xml to Webmaster Tools and it comes back with errors. However, if I type in
    site:stopsmokinghypnosis1.com
    it returns lots of indexed pages.

    Any help would be awesome.
    Regards
    Chris

  16. wdlb
    Member
    Posted 2 years ago #

    Hi

    I have the same problem. I even uploaded my own robots.txt file to the server, and Webmaster Tools still says that some URLs are blocked by that file, among them /post-sitemap.xml and /category-sitemap.xml.

    Please help

  17. wdlb
    Member
    Posted 2 years ago #

    Hi, I have fixed my problem. It turned out that, for some reason, the header of my theme had noindex and nofollow instructions for robots in a meta tag. So check your meta tags, because the solution might be there. Hope that helps!
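wdlb's check can be automated. A minimal sketch using Python's `html.parser` to spot a robots meta tag containing `noindex` — the sample markup below is hypothetical, modeled on what wdlb describes; feed it your theme's rendered `<head>` instead:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Flags a <meta name="robots"> tag whose content includes noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True

# Hypothetical theme header similar to the one wdlb found.
html = '<head><meta name="robots" content="noindex,nofollow"></head>'
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.noindex)  # True
```

If this reports True for your homepage's HTML, the theme (or a setting) is telling search engines not to index the page, independent of robots.txt.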

  18. Joost de Valk
    Member
    Plugin Author

    Posted 2 years ago #

    Guys, open a new thread if you have an issue. This one is closed.

This topic has been closed to new replies.
