I have a problem with Google not being able to access my robots.txt.
This is the message I get when testing my submitted sitemap:
Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.
OK: http://cheapautoinsuranceinfo.net My site is just a crappy little test site, but I need to resolve this issue so I understand it for any other time it might occur on an important site.
I have the Google XML Sitemaps plugin installed, with the option to create a virtual robots.txt turned OFF.
WordPress privacy settings are set to allow search engines.
Permalinks are set to "Post name".
I have a robots.txt file in the root of my domain.
The file seems fine to me; it opens perfectly in a browser.
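In case it helps anyone sanity-check their own file, the directives can be verified locally with Python's built-in robots.txt parser. The `User-agent`/`Disallow` lines below are just a typical permissive example, not necessarily the exact contents of my file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical permissive robots.txt content -- substitute your own file's lines.
lines = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(lines)

# An empty Disallow rule permits everything, so Googlebot should be
# allowed to fetch any URL on the site, including robots.txt itself.
print(parser.can_fetch("Googlebot", "http://cheapautoinsuranceinfo.net/robots.txt"))  # True
```

Of course, this only checks that the directives parse as intended; it says nothing about whether Google can reach the file over the network, which seems to be the actual problem here.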
websniffer.net shows no problems.
I can't see anything wrong with the .htaccess file either...
Yet Google still cannot access it.
Another weird thing I noticed is that the robots.txt file will open normally no matter how many slashes are in the URL. For example, http://cheapautoinsuranceinfo.net/////robots.txt opens fine; surely that is supposed to result in a 404?
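If I understand it correctly, that last part is probably normal rather than a bug: when a server maps the request path to a static file, the underlying filesystem collapses repeated slashes, so `/////robots.txt` resolves to the same file as `/robots.txt`. A quick local sketch of the same behaviour (the temp-dir paths here are purely illustrative):

```python
import os
import tempfile

# Create a throwaway robots.txt in a temp directory.
tmpdir = tempfile.mkdtemp()
robots_path = os.path.join(tmpdir, "robots.txt")
with open(robots_path, "w") as f:
    f.write("User-agent: *\nDisallow:\n")

# A path with repeated slashes resolves to the very same file:
# the filesystem collapses the extra separators, much like a web
# server serving static files typically does.
weird_path = tmpdir + "/////robots.txt"
with open(weird_path) as f:
    content = f.read()

print(os.path.normpath(weird_path) == robots_path)  # True
print(content)
```

So the extra-slashes behaviour is likely a red herring and unrelated to whatever is blocking Googlebot.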
Can anybody please help?