Hello!
A link to the site in question would be most helpful, so we can quickly spot the issue.
For future reference, when you create a new topic, you can submit a link that is hidden from logged-out users.
When it comes to the robots tags not working, these are the only three possible causes I am aware of:
- An object caching plugin may be active. It can cause database calls to serve stale information. See if flushing its caches helps.
- A page caching plugin may be active. It can cause old versions of pages to be output. Again, see if flushing its caches helps.
- WordPress discourages search engines from indexing the site. In that case, TSF disables its robots meta tag. The SEO Bar will flag this issue (quite harshly) in red.
TSF outputs its meta tags independently from other integrations, so I doubt custom debugging code would help investigate this issue.
Any more information you think is useful on this issue will be appreciated! Cheers 🙂
Thanks for the reply.
I’ve found the problem.
Somehow, the value of the blog_public option was set to 2 (instead of 0 for private, 1 for public).
In the is_blog_public() method in core.class.php, you check for a value of 1; otherwise, you assume the blog is private. I'm wondering whether it would be better to test against 0 instead, and return true unless the option value is 0?
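To illustrate, something like this sketch is what I had in mind (hypothetical and simplified, not the actual TSF code; I've made the value a parameter here, standing in for whatever get_option( 'blog_public' ) returns):

```php
<?php
/**
 * Hypothetical sketch of the suggested change — not the actual TSF method.
 * Treat the blog as public unless blog_public is explicitly 0, so that
 * unexpected values like 2 don't silently disable the robots meta tag.
 *
 * @param mixed $value The raw blog_public option value.
 * @return bool True unless the option casts to 0.
 */
function is_blog_public_sketch( $value ) {
    // (int) casting maps 0, '0', false, and '' to 0; 1, '1', 2, etc. stay truthy.
    return 0 !== (int) $value;
}
```

With this, a stray value of 2 would have kept the blog treated as public.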
Anyway, toggling the "Discourage search engines from indexing this site" setting does set the option value back to 0 or 1, which fixes the issue – thanks for the clue!
Heh, what I find more curious is how it even got to 2!?
WordPress sporadically checks for (bool) true/false, (int) 1, (string) '1', and (string) '0' in its source, without ever type checking. So, I'm not sure what the right path is; something else would break anyway (in your case, the option wasn't checked?).

With that said, we could make our check weaker, which should conform to WordPress’ non-standard 🙂 Thanks for the suggestion!
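For illustration, a weaker check could look something like this (a sketch of the idea, not the actual patch; the parameter stands in for the raw option value):

```php
<?php
/**
 * Sketch of a loosened check that conforms to WordPress' mixed usage.
 * A plain bool cast treats false, 0, '0', and '' as private, and
 * 1, '1', true — and stray values like 2 — as public, matching how
 * PHP's type juggling already evaluates these values in conditionals.
 *
 * @param mixed $value The raw blog_public option value.
 * @return bool Whether the blog should be considered public.
 */
function is_blog_public_loose( $value ) {
    // Note: (bool) '0' is false in PHP, unlike most other strings.
    return (bool) $value;
}
```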
I’m glad you’ve found the issue 🙂 Cheers!