• I had an email from Google Webmaster Tools today saying:

    Googlebot cannot access my site. There have been 36 errors trying to access my robots.txt file, so Google has postponed the crawl. My site’s overall robots.txt error rate is 53%.

    I tried using the Fetch as Google tool to access robots.txt and it said ‘temporarily unreachable’. When I try to access the robots.txt file in the browser I get…

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    I looked on the server and couldn’t find a robots.txt file anywhere. I know that I have at least a couple of pages noindexed, so there must be an error somewhere. I’ve read that the file shown in the browser is the standard virtual file that WordPress generates on the fly when no physical robots.txt exists.
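
    For what it’s worth, that default output comes from WordPress’s do_robots() function: when there is no physical robots.txt, the request is routed to do_robots(), which builds those two Disallow lines and passes them through the robots_txt filter before output. A minimal sketch of hooking that filter from a theme’s functions.php or a small plugin (my_extra_robots_rules is just a placeholder name):

        <?php
        // Append extra rules to WordPress’s virtual robots.txt.
        // WordPress only serves this generated output when no physical
        // robots.txt file exists in the site root.
        add_filter( 'robots_txt', 'my_extra_robots_rules', 10, 2 );

        function my_extra_robots_rules( $output, $public ) {
            // $public comes from the blog_public option; it is '0' when
            // “Discourage search engines” is ticked in Settings > Reading.
            if ( '0' !== $public ) {
                $output .= "Disallow: /wp-login.php\n"; // example extra rule
            }
            return $output;
        }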

    I also run W3 Total Cache and have read that it can cause lots of problems with robots.txt. I have disabled all caching for now, but according to the fetch tool this hasn’t made any difference. I’ve seen some old posts about this, but they’re outdated and discuss older versions of W3TC, so I could really do with some help!
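
    A quick way to see the status code Googlebot is actually getting is to request robots.txt while sending Googlebot’s User-Agent string, in case the server or a caching rule treats bots differently. A rough sketch using PHP’s curl extension (example.com stands in for the real domain):

        <?php
        // Fetch robots.txt while mimicking Googlebot’s User-Agent and
        // report the HTTP status code the server returns.
        $ch = curl_init( 'http://example.com/robots.txt' );
        curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
        curl_setopt( $ch, CURLOPT_USERAGENT,
            'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' );
        $body   = curl_exec( $ch );
        $status = curl_getinfo( $ch, CURLINFO_HTTP_CODE );
        curl_close( $ch );

        echo "HTTP status: {$status}\n";
        echo ( false === $body ) ? "Request failed\n" : $body;

    A healthy response is a 200 with the two Disallow lines shown above; a 5xx, a timeout, or an empty body would explain the ‘temporarily unreachable’ result and the error rate Webmaster Tools is reporting.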
