> Googlebot can't access your site
> Over the last 24 hours, Googlebot encountered 58 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
I think the problem is caused by the Google Sitemap XML plugin... but the strange part is, I can't actually SEE a robots.txt file in my directory. The address apparently works... but when I go in via FTP, the file isn't there alongside the sitemap file.
Has anyone else had these problems? It's really getting on my nerves :(