I have domain mapping set up on my network and, as expected, if I put a file such as robots.txt in the web root, I can access that file via any of the mapped sites. The problem is that each site needs a different sitemap, since the sites have nothing in common.
To solve this, I was thinking of creating some rules in .htaccess that rewrite the sitemap.xml URL to a different file depending on which domain the bot is requesting it from.
Can anyone help me out with how to accomplish this, or tell me whether it's even possible?
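For illustration, here is a rough sketch of the kind of mod_rewrite rules I had in mind. The domain names and sitemap filenames (`example-a.com`, `sitemap-siteA.xml`, etc.) are placeholders; it assumes mod_rewrite is enabled and the per-site sitemap files exist in the web root:

```apache
# Hypothetical sketch: serve a different sitemap depending on the requested host.
# Domains and filenames below are placeholders, not my actual setup.
RewriteEngine On

# Requests for sitemap.xml on the first domain get that site's sitemap
RewriteCond %{HTTP_HOST} ^(www\.)?example-a\.com$ [NC]
RewriteRule ^sitemap\.xml$ sitemap-siteA.xml [L]

# Requests for sitemap.xml on the second domain get the other sitemap
RewriteCond %{HTTP_HOST} ^(www\.)?example-b\.com$ [NC]
RewriteRule ^sitemap\.xml$ sitemap-siteB.xml [L]
```

With something like this, a bot fetching `http://example-a.com/sitemap.xml` would be served `sitemap-siteA.xml` internally, without a redirect. Is this the right approach, or is there a better way?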