Disallow search engine indexing of mapped sites?
It would be great if the plugin took advantage of the ‘robots_txt’ filter to disallow indexing of sites that have mapped domains, so that Google doesn’t index both the original site and the mapped URL as duplicate content. I can’t see a case where someone would want both to be indexed. Using the ‘robots_txt’ filter would still allow the rules to be edited or overridden by other plugins, etc.
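A rough sketch of what the suggestion might look like, hooked into WordPress core’s ‘robots_txt’ filter. Note that `site_has_mapped_domain()` is a hypothetical helper standing in for however the plugin actually detects a mapping, and the rule here assumes the request is being served at the original (unmapped) URL:

```php
<?php
/*
 * Sketch only, not the plugin's actual code: when the current site has a
 * mapped domain and is being served at its original URL, append a blanket
 * Disallow rule so search engines only index the mapped URL.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	// Respect the site's "discourage search engines" setting ($public)
	// and only act when a mapping exists (hypothetical helper below).
	if ( $public && site_has_mapped_domain( get_current_blog_id() ) ) {
		$output .= "\nUser-agent: *\nDisallow: /\n";
	}
	return $output;
}, 10, 2 );
```

Because it goes through the filter, any later-running plugin (or a theme’s functions.php) could still rewrite or remove the rule.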