• I found this message in the Google Webmaster Tools dashboard: “Over the last 24 hours, Googlebot encountered 47 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 14.6%.” Why is this happening, and how can I solve it? My website’s robots.txt is:

    # global
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/cache/
    Disallow: /wp-content/themes/
    Disallow: /trackback/
    Disallow: /comments/
    Disallow: */trackback/
    Disallow: */comments/
    Disallow: /wp-login.php
    Disallow: /wp-signup.php

    URL: http://question2answers.com/robots.txt
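As a sanity check on the rules themselves, Python's standard `urllib.robotparser` can parse a robots.txt and report which paths it blocks. This is a minimal sketch using a subset of the rules above (note that Googlebot's message is about *fetching* the file, not its syntax, so a clean parse here would point the blame at server errors or permissions rather than the rules):

```python
import urllib.robotparser

# A subset of the robots.txt rules quoted above
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())  # parse from a list of lines, no network needed

print(rp.can_fetch("Googlebot", "http://question2answers.com/"))           # allowed
print(rp.can_fetch("Googlebot", "http://question2answers.com/wp-admin/"))  # blocked
```

One caveat: `urllib.robotparser` matches paths literally, so wildcard rules like `*/trackback/` are not interpreted the way Googlebot interprets them; keep those out of a local check like this.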

Viewing 2 replies - 1 through 2 (of 2 total)
• Why don’t you change it to something simple, like the lines below:

    User-agent: *
    Allow: /

    and make sure that everyone has permission to read the file.
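To illustrate that permission check, here is a small Python sketch that creates a stand-in robots.txt (the path is hypothetical; on a real server you would inspect the actual file) and verifies the world-readable bit that web servers need in order to serve the file to crawlers:

```python
import os
import stat
import tempfile

# Create a stand-in robots.txt just for this demonstration
path = os.path.join(tempfile.mkdtemp(), "robots.txt")
with open(path, "w") as f:
    f.write("User-agent: *\nAllow: /\n")

# 0o644 = owner read/write, group read, world read
os.chmod(path, 0o644)

mode = os.stat(path).st_mode
world_readable = bool(mode & stat.S_IROTH)  # "others" read bit
print(world_readable)
```

On a live server the equivalent quick check is `ls -l robots.txt` (look for the final `r` in `-rw-r--r--`), followed by fetching the URL in a browser to confirm it returns the file rather than an error page.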

    Hello!

    Google lets you access your “Crawl Errors” report, which shows you precisely which URLs are failing and which pages point to them.

    I suggest you browse that report to find where the problem is, if there is one 🙂

    (It always annoys me that they list, as problems, external links pointing to pages deleted from our blog long ago; there’s nothing we can do about those, sigh.)


The topic ‘Googlebot can't access your site’ is closed to new replies.