Google can't/won't index site – Robots.txt issue – or is it?

  • My site is

    Up until December 4th, 2012, my blog content regularly showed up in Google search results.

    As of December 5th (based on Google Analytics), something happened: from that date forward, inbound Google search queries went to zero. Of course it’s possible that people simply stopped searching for (and finding) my content. That said, I get a respectable and consistent amount of inbound traffic (for me), and it’s very odd to see it throttle to zero literally overnight.

    After doing some research, it seems that my robots.txt file is preventing Google from “seeing” my site. Based on some other topics I’ve read here, it’s related to robots.txt – and this is apparently not uncommon.

    What is odd is that I don’t recall ever making changes to robots.txt (I’ve left everything at the default). I suspect this is somehow plugin related, which further leads me to believe that on December 4th/5th I updated one of my plugins and it modified the robots.txt file. Yoast SEO and TinyMCE are my two suspects right now.

    I believe that TinyMCE is the culprit, based on what Webmaster Tools tells me when I try to submit a sitemap:

    Sitemap contains urls which are blocked by robots.txt

    I’ve since removed TinyMCE completely (deactivated and deleted the plugin), and yet the issue persists. The file does not exist anywhere, yet the sitemap somehow still includes a reference to it.

    I’m at a loss – can you help/point me in a different direction? What information do you feel I should be including here to provide more color/context?

    Clearly I don’t know enough to fix this.

    Many thanks in advance.

Viewing 4 replies - 1 through 4 (of 4 total)
  • Moderator Mark Ratledge


    Forum Moderator

    TinyMCE won’t change robots.txt; Yoast can.

    Robots.txt files can impact indexing, but if your inbound traffic has dropped, check your indexing with a site:yourdomain.com search in Google:

    In other words, you’re not indexed at all.

    Right now, this is your robots.txt file, blocking anything in wp-includes, which it should be doing:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
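
    The rules above can be verified mechanically rather than by eye. A minimal sketch using Python’s standard urllib.robotparser (the domain and paths are placeholders, not the poster’s actual URLs):

    ```python
    from urllib.robotparser import RobotFileParser

    # The exact rules from the robots.txt shown above.
    rules = """User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # An ordinary post is crawlable under these rules...
    print(parser.can_fetch("Googlebot", "https://example.com/2012/12/some-post/"))   # True
    # ...while anything under /wp-includes/ is blocked, as intended.
    print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/a.js"))  # False
    ```

    If a real sitemap URL comes back False here, robots.txt genuinely is the problem; if everything comes back True, the cause of the de-indexing lies elsewhere.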

    Thanks for the clarification –

    If this is the case, what would you prescribe? And any clue why this would happen so suddenly?

    Moderator Mark Ratledge


    Forum Moderator

    Use Google Webmaster Tools to find out.

    The trick is, I already have (to the best of my ability). Its diagnosis seems to indicate nothing is wrong.

    I get 200 (Success) with no URLs blocked via robots.txt.

    It certainly doesn’t indicate what the root cause of my non-indexing issue is.
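
    If robots.txt returns 200 and blocks nothing, one other thing worth ruling out (not raised in this thread, but a common WordPress cause of sudden de-indexing) is a robots meta tag, which WordPress emits site-wide when “Discourage search engines” is enabled. A rough sketch of that check, with the fetch left as a commented placeholder since the real site URL isn’t given:

    ```python
    import re

    def has_noindex(html: str) -> bool:
        """Return True if the page carries a robots meta tag containing 'noindex'.

        Assumes the name attribute precedes content, which is how WordPress
        emits the tag; a full HTML parser would be more robust.
        """
        pattern = re.compile(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
            re.IGNORECASE,
        )
        return bool(pattern.search(html))

    # Hypothetical usage -- substitute the real site for example.com:
    # import urllib.request
    # html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "replace")
    # print(has_noindex(html))

    # Self-check against the tag WordPress generates:
    print(has_noindex("<meta name='robots' content='noindex,follow' />"))  # True
    print(has_noindex("<meta name='robots' content='index,follow' />"))    # False
    ```

    A noindex tag would explain exactly this symptom: a clean 200 on robots.txt, nothing blocked, yet the site vanishing from the index.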

  • The topic ‘Google can't/won't index site – Robots.txt issue – or is it?’ is closed to new replies.