• Just checked out Webmaster Tools and it says that Googlebot can’t access the site. When I click for details it says ‘Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.’

    I have no idea what that means.

    The solution it suggests is:

    If the site error rate is 100%:

    1. Using a web browser, attempt to access your site’s URL followed by /robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot (a quick way to test this from a script is sketched just after this list).

    2. If your robots.txt is a static page, verify that your web server has the proper permissions to access the file (see the permissions sketch below).

    If your robots.txt is dynamically generated, verify that the scripts that generate it are properly configured and have permission to run. Check your website’s logs to see if those scripts are failing, and if so, attempt to diagnose the cause of the failure.
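
    To approximate what Googlebot sees for step 1, you can request robots.txt with a Googlebot-style user-agent string and look at the HTTP status. The sketch below is a minimal, hypothetical check, assuming Python 3 and using www.example.com as a stand-in for your own domain:

        # Fetch /robots.txt the way a Googlebot-like client would and report the result.
        # www.example.com is a placeholder; substitute your own domain.
        import urllib.request
        import urllib.error

        url = "http://www.example.com/robots.txt"
        req = urllib.request.Request(
            url,
            headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
        )

        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print("Status:", resp.status)  # 200 means the file is reachable
                print(resp.read().decode("utf-8", errors="replace"))
        except urllib.error.HTTPError as e:
            print("HTTP error:", e.code)       # e.g. 403 suggests the server or a firewall is blocking the request
        except urllib.error.URLError as e:
            print("Could not connect:", e.reason)

    If the file loads fine in your browser but this request is refused, that points at user-agent or bot filtering in your firewall or a security plugin.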
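
    For step 2, the rough idea is to confirm that a static robots.txt exists and is readable by the account your web server runs as. A small sketch of that check, assuming the file lives at /var/www/html/robots.txt (adjust the path to your actual document root):

        # Report existence, permission bits, and ownership of a static robots.txt.
        # The path below is an assumption; point it at your site's document root.
        import os
        import stat

        path = "/var/www/html/robots.txt"

        if not os.path.exists(path):
            print("robots.txt not found at", path)
        else:
            st = os.stat(path)
            print("Permissions:", stat.filemode(st.st_mode))  # something like -rw-r--r-- is typical
            print("Owner UID:", st.st_uid, "GID:", st.st_gid)
            print("World-readable:", bool(st.st_mode & stat.S_IROTH))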

    Anyone have any idea what this all means and how I can fix it? Or is it not even much of an issue?

  • It’s only a big deal if you want Google to index your site. You may not care about that. (I certainly don’t about some of my sites.)

    I would FTP to your site and take a look at your robots.txt file. This site will explain what that is all about: http://www.robotstxt.org/
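
    For reference, a robots.txt that allows every crawler to fetch everything can be as short as the two lines below; the site linked above documents the full syntax.

        User-agent: *
        Disallow: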

  • The topic ‘Googlebot can't access the site’ is closed to new replies.