Unblock the URLs that are blocked by robots.txt, or remove those URLs from your XML sitemap using the Advanced tab on those pages.
Sorry, I’m a newbie to all of this. The URL that is blocked is my site’s homepage. I don’t know where to even begin to unblock the URL. I’ve gone through every plug-in to see if any others have privacy settings I might have missed. I tried looking through the root directory and can’t even begin to figure that out…
If you don’t give me a URL to your site I can’t help you…
That sitemap is not generated by my plugin and, in fact, doesn’t exist anymore. So that’s why GWT might be throwing errors…
I’m now totally confused. I originally had a different plugin (Google XML Sitemap) installed and used it to create the first sitemap before I pinged the search engines. When I got the message from Google, I looked at all the privacy settings and found your plugin had also created a sitemap. I uninstalled the original plugin and used your plugin to create a new sitemap. I then re-pinged Google.
If the sitemap listed isn’t from your plugin, how was it generated? Is it somehow from the first plugin, which I installed a few months ago?
Do you have any suggestions on how I can fix this?
Thank you for your patience in helping me through this.
You can just delete that file; that plugin generated a static XML sitemap, whereas mine isn’t a static file.
Thanks, Joost. I’m sorry if I seem slow but this is all so new to me. How do I go about doing that? Where can I find the file to delete it? My assumption is that once I delete that old file, I re-generate a new sitemap using your plug-in. Is that correct?
Again, thank you.
Log in to your server with FTP and delete the sitemap.xml in the root of your install; usually that’s the public_html directory.
You don’t have to regenerate a file using my plugin; it’s already there, as you can see. You might have to add it to Google Webmaster Tools, though.
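If you’d rather script the deletion than click around in an FTP client, the same steps can be sketched with Python’s standard ftplib. This is just a sketch: the host, username, and password in the usage comment are placeholders, not real values, and public_html is only the usual default document root.

```python
from ftplib import FTP

def delete_static_sitemap(host, user, password, docroot="public_html"):
    """Remove the stale static sitemap.xml left behind by the old plugin."""
    ftp = FTP(host)            # connect to your hosting account
    ftp.login(user, password)
    ftp.cwd(docroot)           # the root of the WordPress install
    ftp.delete("sitemap.xml")  # the static file the old plugin wrote
    ftp.quit()

# Call it with your real credentials, e.g.:
# delete_static_sitemap("ftp.example.com", "username", "password")
```

The function only deletes sitemap.xml itself; the dynamic sitemap served by the plugin isn’t a file on disk, so it is unaffected.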
Thanks, Joost. I really appreciate you helping me through this.
I’m receiving the same error in WMT. It’s even saying that the warning is for this value (as well as others):
Sitemap: /page-sitemap.xml
Value: http://www.domain.co.uk/
Slightly worrying, but the /robots.txt is just blocking the standard /wp-admin and /wp-includes folders. I’m a little confused here, as the sitemaps that have been submitted are also 100% clean.
I appreciate I haven’t given much clarity on this.
Hi,
I am also a little confused. My site is
http://stopsmokinghypnosis1.com
My robots file looks like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
It is also saying that Google tried to crawl but was blocked by the robots.txt file?
I submitted my sitemap_index.xml to Webmaster Tools and it came back with errors?
However, if I type in
site:stopsmokinghypnosis1.com
it returns lots of indexed pages?
Any help would be awesome.
Regards
Chris
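For what it’s worth, you can check locally whether a robots.txt like the one quoted above actually blocks a given URL, using Python’s standard urllib.robotparser. The rules below are copied from the comment, and the URLs are just the site mentioned there; this only tests the rules as written, not whatever Google actually fetched.

```python
from urllib.robotparser import RobotFileParser

# The exact rules quoted in the comment above.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The homepage is allowed; only the two WordPress folders are blocked.
print(rp.can_fetch("Googlebot", "http://stopsmokinghypnosis1.com/"))           # True
print(rp.can_fetch("Googlebot", "http://stopsmokinghypnosis1.com/wp-admin/"))  # False
```

If this prints True for the homepage, the robots.txt on the server isn’t what’s blocking Google, and the warning is more likely stale data or a different file being served at crawl time.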
Hi
I have the same problem. I even uploaded my own robots.txt file to the server, and Webmaster Tools still says that some URLs are blocked by that file, among them /post-sitemap.xml and /category-sitemap.xml.
Please help