I have domain mapping set up on my network, and as expected, if I put a file such as robots.txt in the web root, I can access that file via any of the sites:
The apparent issue is that the sites each need a different sitemap (as the sites have nothing in common).
So to solve this, I was thinking of creating some sort of rules in .htaccess that rewrite the sitemap URL depending on which domain is requested.
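For illustration, here is a minimal sketch of what such mod_rewrite rules might look like. The domain names and sitemap file names below are placeholders I made up, not anything from my actual setup, and this assumes Apache with mod_rewrite enabled:

```apache
# Hypothetical sketch: domains and sitemap file names are placeholders.
RewriteEngine On

# Serve a per-site sitemap file based on the requested host.
RewriteCond %{HTTP_HOST} ^(www\.)?site-a\.example$ [NC]
RewriteRule ^sitemap\.xml$ sitemap-site-a.xml [L]

RewriteCond %{HTTP_HOST} ^(www\.)?site-b\.example$ [NC]
RewriteRule ^sitemap\.xml$ sitemap-site-b.xml [L]
```

The idea is that each `RewriteCond` matches one mapped domain via `%{HTTP_HOST}`, and the following `RewriteRule` internally serves that domain's own sitemap file, so every site can still advertise `/sitemap.xml` while the content differs per domain.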
Can anyone help me out with how to accomplish this, or let me know if it's even possible?
- The topic ‘Network unique sitemap.xml's for each site’ is closed to new replies.