[closed] "Googlebot can't access your site" error message (2 posts)

  1. finallyanime
    Posted 3 years ago #

    Googlebot can't access your site

    Over the last 24 hours, Googlebot encountered 58 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.

    I think there's a problem caused by the Google Sitemap XML plugin... but the thing is, I can't actually SEE the robots.txt file in my directory. The address is apparently there... but I don't see it alongside the sitemap file when I connect via FTP.

    Has anyone else had these problems? It's really getting on my nerves :(

  2. Please do not bump, that's not permitted here.

    I can't actually SEE the robots.txt file in my directory.

    WordPress can generate a robots.txt file if you have fancy permalinks turned on. What's your site URL? You can see what's being generated via http://your-wordpress-url/robots.txt
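    To illustrate what that dynamically generated file does, here is a minimal sketch (not part of the original thread) using Python's standard-library robots.txt parser. The rules shown are an assumption based on a typical default WordPress virtual robots.txt; `example.com` and the post URL are hypothetical:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical rules resembling a default WordPress virtual robots.txt
    robots_txt = """User-agent: *
    Disallow: /wp-admin/
    """

    parser = RobotFileParser()
    # Parse the rules directly from the string, as a crawler would after fetching /robots.txt
    parser.parse(robots_txt.splitlines())

    print(parser.can_fetch("Googlebot", "http://example.com/wp-admin/"))      # False
    print(parser.can_fetch("Googlebot", "http://example.com/2012/05/post/"))  # True
    ```

    The point is that Googlebot only needs to fetch the URL `/robots.txt` successfully; it never looks for a physical file over FTP, which is why the errors can occur even though no file is visible in the directory.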

    Edit: Also, please do not create duplicate topics. It makes it harder for volunteers such as myself to provide support.


    Continue on your other topic.

Topic Closed

This topic has been closed to new replies.
