Support » Fixing WordPress » "Robots.txt inaccessible", shows up under "Blocked URLs" in Webmaster Tools

• So I got a Crawl Error notification stating that Robots.txt was inaccessible and that the crawl would be postponed.

Looking under “Blocked URLs” in Webmaster Tools, I see that Robots.txt was successfully accessed 16 hours ago, with no URLs blocked, and with the following content:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    I started investigating this when I realised that my recent posts were not being indexed. I assumed they were not being indexed because the crawl was being postponed.

    Is this accurate? If so, why would Webmaster Tools show a proper Robots.txt? It seems in conflict with the Crawl Error.

    Thanks guys. For your reference, my website is
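For what it's worth, the robots.txt shown above is the standard WordPress default and should not block any posts. One way to sanity-check what it actually disallows is Python's built-in parser (the post path below is just a made-up example, not from the site in question):

```python
from urllib import robotparser

# The robots.txt content reported by Webmaster Tools
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Ordinary post URLs are crawlable; only the admin and includes
# directories are blocked by these rules.
print(rp.can_fetch("*", "/2013/05/some-recent-post/"))  # True
print(rp.can_fetch("*", "/wp-admin/options.php"))       # False
```

If the parser says your posts are allowed (as it does here), the missing indexing is not caused by the rules themselves but by Google failing to fetch the file at all.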

Viewing 2 replies - 1 through 2 (of 2 total)
  • Okay, so an updated sitemap appears to have been approved…it shows that some pages have indeed been indexed.

However, I still see the same Crawl Error message pertaining to Robots.txt. The pages do not seem to be showing up on Google unless I search for site:URL, so I am guessing there is still something wrong.

    ANY ideas?

    Side note: I have tried to correct some of the issues that might have been causing the crawl errors (DNS, Server Connectivity, Robots.txt Fetch). I can’t tell if the error reports that are showing up are from previous issues with my site, or if they are recurring now.

    Is there any input about what might make things run more smoothly? I can’t tell if this is simple or way “out there”.
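Regarding the Robots.txt Fetch error: what matters to Google is the HTTP status it gets back when requesting /robots.txt. A rough, simplified sketch of the documented behaviour (the function and its wording are mine, not Google's API):

```python
def robots_fetch_outcome(status: int) -> str:
    """Simplified sketch of how Googlebot reacts to a robots.txt
    fetch result, per Google's documented crawling behaviour."""
    if 200 <= status < 300:
        return "parse rules and crawl accordingly"
    if status in (404, 410):
        return "assume no restrictions and crawl everything"
    if 500 <= status < 600:
        return "treat robots.txt as inaccessible and postpone the crawl"
    return "other status; handling depends on the exact response"

# A 503 during a DNS or server-connectivity hiccup would explain the
# "inaccessible, crawl postponed" notification.
print(robots_fetch_outcome(503))
```

So intermittent server or DNS trouble can make the file "inaccessible" even though Webmaster Tools later fetches it fine, which would reconcile the two reports you are seeing. Checking the server logs for 5xx responses on /robots.txt around the error timestamps would confirm it.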

  • The topic ‘"Robots.txt inaccessible", shows up under "Blocked URLs" in Webmaster Tools’ is closed to new replies.