Google can't/won't index site – Robots.txt issue – or is it?
My site is http://www.thesocialwhat.com.
Up until December 4th, 2012, my blog content regularly showed up in Google search results.
As of December 5th (based on Google Analytics), something happened: from that date forward, inbound Google search queries went to zero. Of course, it's possible that people simply stopped searching for (and finding) my content. That said, I get a respectable amount of consistent inbound traffic (for me), and it's very odd to see it drop to zero literally overnight.
After doing some research, it seems that my robots.txt file is preventing Google from "seeing" my site. Based on some other topics I've read here, robots.txt problems like this are not uncommon.
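To illustrate how a single rule can cause this, here's a quick sanity check using Python's built-in urllib.robotparser. The robots.txt contents below are hypothetical – I don't know exactly what my file actually said – but a blanket "Disallow: /" like this would block Googlebot from every URL on the site, including the sitemap:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: a blanket block, the kind a misbehaving plugin
# (or a "discourage search engines" setting) might write.
blocking_rules = [
    "User-agent: *",
    "Disallow: /",
]

# A typical default WordPress robots.txt, for comparison: only wp-admin blocked.
default_rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
]

def googlebot_can_fetch(rules, url):
    """Return True if Googlebot may fetch `url` under the given robots.txt lines."""
    parser = RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch("Googlebot", url)

print(googlebot_can_fetch(blocking_rules, "http://www.thesocialwhat.com/"))
print(googlebot_can_fetch(blocking_rules, "http://www.thesocialwhat.com/sitemap.xml"))
print(googlebot_can_fetch(default_rules, "http://www.thesocialwhat.com/"))
```

With the blanket rule, both the homepage and the sitemap come back as blocked, which would explain both the zero search traffic and the Webmaster Tools sitemap error; with the default rules, everything is allowed.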
What is odd is that I don't recall ever making changes to robots.txt (I've left everything at the defaults). I suspect this is somehow plugin-related, which leads me to believe that when I updated my plugins on December 4th/5th, one of them modified the robots.txt file. Yoast SEO and TinyMCE are my two suspects right now.
I believe that TinyMCE is the culprit, based on what Webmaster Tools tells me when I try to submit a sitemap:
Sitemap contains urls which are blocked by robots.txt
I've since removed TinyMCE completely (deactivated and deleted the plugin), and yet the issue persists. The file does not exist anywhere on my server, yet somehow the sitemap still includes a reference to it.
I'm at a loss – can you help, or point me in a different direction? What information do you feel I should include here to provide more color/context?
Clearly I don’t know enough to fix this.
Many thanks in advance.