
[Resolved] MS Robots.txt setup: Subdomain Install with Mapped Domains

  • Even though the site in question is running on a mapped domain, the plugin reports that it is not a mapped domain. Why is that?

    Plugin Author tribalNerd

    @tribalnerd

    Hello John,
    Did you by chance have any other robots.txt plugins installed before MS Robots.txt Manager?

    In another thread, a user had previously used the KB Robots.txt plugin. When that plugin was deleted, it left the KB robots.txt file behind in the site's settings. To correct the issue, the user edited the KB robots.txt setting to be blank. Location: Network Admin > Sites > Edit Site > Settings

    Let me know if this solves your problem.

    ~Chris

    Hi Chris,

    I didn’t have any leftover robots.txt files. I’ve figured out that the problem is a conflict with the virtual robots.txt files generated by WordPress when a user (un)selects ‘Discourage search engines from indexing this blog’.

    One or two of my sites had blocked or unblocked search engines at some point. I set them all to allow search engines and the plugin now works, to a degree.

    I’ve got a system-wide robots.txt file configured through MS Robots.txt Manager and a different one for site 3. However, I want the system-wide file to display on the unmapped site 3 and the custom robots.txt to display on the mapped domain of site 3.

    It's currently the wrong way round.
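
    For background on the conflict described above: WordPress serves a virtual robots.txt whose body depends on the site's `blog_public` option (the 'Discourage search engines' checkbox), and then passes it through the `robots_txt` filter before output. A simplified sketch of that behaviour, not the exact core source (`sketch_do_robots` is a hypothetical name used here for illustration):

    ```php
    <?php
    // Simplified sketch of WordPress's virtual robots.txt generation.
    // When 'Discourage search engines' is checked, blog_public is '0'
    // and the whole site is disallowed; otherwise default rules (like
    // the ones seen on the mapped domain above) are emitted. Either
    // way, core filters the result through 'robots_txt', which is the
    // hook plugins and mu-plugins use to override it.
    function sketch_do_robots( $public ) {
        $output = "User-agent: *\n";
        if ( '0' == $public ) {
            $output .= "Disallow: /\n";
        } else {
            $output .= "Disallow: /wp-admin/\n";
            $output .= "Disallow: /wp-includes/\n";
        }
        return $output; // core: apply_filters( 'robots_txt', $output, $public );
    }
    ```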

    Chris,

    I’ve been playing around with this and I’m still stuck:

    (Root site) – live.service.domain.net/robots.txt
    # robots.txt
    User-agent: *
    Disallow: /

    I want this for all non-mapped domains

    (Site 3) – cctld.live.service.domain.net/robots.txt
    # robots.txt
    User-agent: *
    Disallow: /wp-content/plugins/

    I want this for the mapped domains only

    (Mapped Site 3) – http://www.britishcouncil.org.ng/robots.txt
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Generated by WordPress, not wanted at all

    Plugin Author tribalNerd

    @tribalnerd

    Hello again John,
    I’ll look into adding a warning if that feature is checked.

    As for your next post: when I open http://www.britishcouncil.org.ng/robots.txt

    I get the MS Robots.txt Manager's robots.txt file:
    # robots.txt
    User-agent: *
    Disallow: /wp-content/plugins/

    Is this what you want it to do?

    ….

    If you want the robots.txt files for non-mapped and mapped sites to differ: first run a Network Update so every site gets the same (mapped-site) robots.txt file, then for each non-mapped site select it in the drop-down, manually enter the non-mapped robots.txt rules, and click Update This Website.

    Let me know if this helped.

    ~Chris

    Hi Chris,

    I ended up uninstalling your plugin as it couldn’t seem to override the system-generated robots.txt files. Instead I wrote an mu-plugin and added this to it:

    /** Load custom robots.txt */
    function custom_robots( $output ) {
    	$output = "# robots.txt\nUser-agent: *\nDisallow: /";
    	$url = site_url();
    	// strpos() returns 0 (a falsy value) when the needle sits at the
    	// start of the string, so compare against false explicitly
    	// rather than relying on the ! operator.
    	if ( strpos( $url, "%%%servername%%%.%%%domain%%%.net" ) === false ) {
    		// Public (mapped) domains: allow robots everywhere except plugins
    		$output .= "wp-content/plugins/";
    	}
    	return $output;
    }
    add_filter( 'robots_txt', 'custom_robots' );
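
    If you take the mu-plugin route, the file just needs to live in `wp-content/mu-plugins/` (create the directory if it doesn't exist; WordPress loads every PHP file there automatically). A sketch, assuming the commands run from the WordPress root and using an arbitrary filename `custom-robots.php`:

    ```shell
    # Create the must-use plugins directory and drop the filter in.
    # Run from the WordPress root; the filename is arbitrary.
    mkdir -p wp-content/mu-plugins
    cat > wp-content/mu-plugins/custom-robots.php <<'PHP'
    <?php
    /** Load custom robots.txt */
    function custom_robots( $output ) {
        $output = "# robots.txt\nUser-agent: *\nDisallow: /";
        // Compare against false: strpos() returns 0 (falsy) on a
        // match at position 0.
        if ( strpos( site_url(), "%%%servername%%%.%%%domain%%%.net" ) === false ) {
            // Public (mapped) domains: only block the plugins directory.
            $output .= "wp-content/plugins/";
        }
        return $output;
    }
    add_filter( 'robots_txt', 'custom_robots' );
    PHP
    ```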

    Anyone else struggling with this can just add that code to their theme's functions.php file. Just change this line to match your own internal domain:

    if ( strpos( $url, "%%%servername%%%.%%%domain%%%.net" ) === false ) {

  • The topic ‘[Resolved] MS Robots.txt setup: Subdomain Install with Mapped Domains’ is closed to new replies.