Support » Plugin: Google XML Sitemaps » "Googlebot can't access your site"…? Sitemaps fault?

  • Googlebot can’t access your site

    Over the last 24 hours, Googlebot encountered 58 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

    Further, it says…

    Using a web browser, attempt to access your robots.txt file. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
    If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
    If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
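    The "we postponed our crawl" wording in the message follows from how Googlebot treats robots.txt fetch results: a successful fetch is obeyed, a 404 is treated as "no restrictions", and a server error or timeout makes Google hold off crawling entirely. A minimal sketch of that decision logic (the function name and return strings are illustrative, not anything Google exposes):

    ```python
    # Sketch of Googlebot's documented reaction to robots.txt fetch results.
    # A 100.0% robots.txt error rate means every fetch hit a "postpone" branch.
    def crawl_decision(status):
        """Map an HTTP status for /robots.txt to Googlebot's crawl decision.

        Pass None for a network error or timeout (the fetch never completed).
        """
        if status is None:            # unreachable host, timeout, reset
            return "postpone crawl"
        if 200 <= status < 300:       # robots.txt fetched successfully
            return "crawl, obeying robots.txt rules"
        if status in (404, 410):      # no robots.txt at all is fine
            return "crawl without restrictions"
        # 5xx and other unexpected responses: Google assumes it must not crawl
        return "postpone crawl"

    print(crawl_decision(503))   # server error while generating robots.txt
    print(crawl_decision(200))   # healthy site
    ```

    So the errors reported here do not mean robots.txt is blocking Googlebot; they mean the robots.txt URL itself could not be fetched cleanly.
    
    
    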

    I looked in my root directory and I can’t see the robots.txt or sitemap file…? How do I view them to set permissions if I can’t even see them…? I’m just a bit confused and this is really annoying me -_-”
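    Not finding the files on disk is expected: WordPress serves robots.txt dynamically (via `do_robots()` and the `robots_txt` filter), and this plugin can generate the sitemap the same way, so there may be no physical file in the root at all; the content only exists when the URL is requested. A minimal sketch of that idea using Python's standard library (the served rules below are just placeholder content):

    ```python
    # Demonstrates a "virtual" robots.txt: the response is generated in code
    # when /robots.txt is requested, with no file on disk to find or chmod.
    import threading
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class DynamicRobots(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/robots.txt":
                body = b"User-agent: *\nAllow: /\n"
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(body)   # built on the fly, never stored
            else:
                self.send_response(404)
                self.end_headers()

        def log_message(self, *args):    # keep demo output quiet
            pass

    server = HTTPServer(("127.0.0.1", 0), DynamicRobots)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    url = "http://127.0.0.1:%d/robots.txt" % server.server_port
    robots = urllib.request.urlopen(url).read().decode()
    print(robots)
    server.shutdown()
    ```

    So to "view" your robots.txt and sitemap, open their URLs in a browser rather than looking for files over FTP; file permissions only come into play if you have created static versions of them.
    
    
    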

  • The topic ‘"Googlebot can't access your site"…? Sitemaps fault?’ is closed to new replies.