Support » Fixing WordPress » robots.txt blocking Google

  • Thea


    I just created a new website, and when I added it to Webmaster tools, it tells me the site is “denied by robots.txt”.

    I am using the same theme as one of my other sites, with the exact same plugins on the same settings. I can see the virtual robots.txt file, and it looks exactly the same as on all my other websites:

    User-agent: *

    However, for some reason Google Webmaster Tools claims it says:

    User-agent: *
    Disallow: /

    I’ve tried installing the KB plugin and rewriting the file, but it makes no difference. Does anybody have any ideas about what else I could look at?

    By the way, I have set the site to “private” for now, because a lot of the content is duplicated on another site and I can’t afford to have both visible at once. However, I did check to make sure the site was set to public while I was testing!
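    As a quick sanity check on the two versions above: the difference between an empty `Disallow:` and `Disallow: /` is everything, since the latter blocks the whole site. Python’s standard-library `urllib.robotparser` can show what Googlebot would be allowed to fetch under each ruleset (a sketch with a placeholder domain, not part of the original thread):

    ```python
    from urllib import robotparser

    # The rules the site appears to serve vs. what Webmaster Tools reported.
    # Parsed directly as text so the two versions can be compared side by side.
    open_rules = "User-agent: *\nDisallow:\n"       # empty Disallow = allow everything
    blocked_rules = "User-agent: *\nDisallow: /\n"  # "/" = block the entire site

    def googlebot_allowed(rules: str, url: str) -> bool:
        """Return True if Googlebot may fetch `url` under the given robots.txt text."""
        parser = robotparser.RobotFileParser()
        parser.parse(rules.splitlines())
        return parser.can_fetch("Googlebot", url)

    print(googlebot_allowed(open_rules, "http://example.com/a-post/"))     # True
    print(googlebot_allowed(blocked_rules, "http://example.com/a-post/"))  # False
    ```

    If the live file really contained only the open version, the second result should never apply, which is what makes the Webmaster Tools report so puzzling.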

  • The topic ‘robots.txt blocking Google’ is closed to new replies.