While testing a site, I checked the "Ask search engines not to index this site." option. However, after unchecking it, Google is still showing a message saying that the site can't be indexed. It has been a couple of weeks now.
If someone could explain how WordPress sets robots.txt, I may be able to fix this. I assumed it simply added a robots.txt file to the site root, but it doesn't. I've added my own robots.txt file containing "User-agent: *", and I can see this file when I browse to mysite.com/robots.txt. I assume WordPress is overriding it, but I cannot find where, and a Google search has turned up no answers.
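For reference, a fully permissive robots.txt would look like the following (the empty Disallow line is conventional shorthand for "allow everything"; my actual file currently has only the User-agent line):

    User-agent: *
    Disallow: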
It seems as if WordPress is generating this file on the fly somehow, and unchecking the option is not reversing this.
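From digging around, I believe the relevant piece of core is do_robots() in wp-includes/functions.php, which builds the response from the blog_public option. This is my rough paraphrase of what I understand it to do, not the exact source:

    // Rough paraphrase of WordPress core's do_robots()
    // (wp-includes/functions.php); not the exact source.
    function do_robots() {
        header( 'Content-Type: text/plain; charset=utf-8' );

        $output = "User-agent: *\n";

        // blog_public is the option behind the
        // "Ask search engines not to index this site" checkbox.
        $public = get_option( 'blog_public' );
        if ( '0' == $public ) {
            $output .= "Disallow: /\n";
        }

        // Plugins can rewrite the output via this filter.
        echo apply_filters( 'robots_txt', $output, $public );
    }

If that's right, I could presumably hook the robots_txt filter to force the output I want, but I'd still like to understand why unchecking the setting hasn't taken effect, and why my physical file isn't simply being served instead.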