You don’t have to list your sitemap in robots.txt, you know.
I know, but I have read that any “dings” on your site are bad for your page’s SEO ranking. And since the first thing the Google robot looks for on your site is robots.txt, if it does not find it, that is a possible “ding” on your site.
I have also read that it does not matter whether you have a robots.txt or not. But I would rather be on the safe side and not incur any “dings”, in order to keep my site ranking high.
Especially since a robots.txt can’t hurt the site; it can only help it, by having the sitemap listed there:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://primary-domain.com/sitemapindex.xml
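To make the per-domain goal concrete, here is a rough sketch of the file each mapped domain would need. This is purely illustrative Python, not WordPress code; the `robots_txt` helper and the `sitemapindex.xml` filename are assumptions matching the example above, so adjust the filename to whatever your sitemap plugin actually emits.

```python
def robots_txt(domain):
    """Build the robots.txt body for one mapped domain.

    Same template for every site; only the Sitemap line changes,
    pointing at that domain's own sitemap index.
    """
    return "\n".join([
        "User-agent: *",
        "Disallow: /wp-admin/",
        "Disallow: /wp-includes/",
        f"Sitemap: http://{domain}/sitemapindex.xml",
    ])

# One body per mapped domain, each pointing at its own sitemap:
print(robots_txt("primary-domain.com"))
```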
For the reasons above, I would love to have robots.txt for all my domains.
So does anybody know how to enable robots.txt for mapped domains?
I know, but I have read that any “dings” on your site are bad for your page’s SEO ranking. And since the first thing the Google robot looks for on your site is robots.txt, if it does not find it, that is a possible “ding” on your site.
Not entirely correct.
Yes, dings are bad.
No, not having a sitemap is not a ding.
Actually, Google, by default, looks at sitemap.xml.
Read http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156184
You really do not need a sitemap unless you have a lot of pages that aren’t linked to, which you probably don’t. Heck, most of us never need one at all. Why? Because WordPress is awesome and links things correctly out of the box. Mostly.
Having a Sitemap will not increase your ranking in Google’s search results pages—but it will not reduce it, either.
The fewer-plugin way to do this is as follows:
1) Create a Google Webmaster account
2) Set it up on your site
3) Submit the sitemap
4) Have a beer 🙂
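For step 3, Google also accepts a sitemap “ping”: fetching a URL of the form `http://www.google.com/ping?sitemap=<sitemap-url>` asks Google to re-read the sitemap, which is handy when you have many sites. A sketch of building that URL, assuming your sitemap lives at the address from the robots.txt example above:

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Build the Google sitemap-ping URL for a given sitemap.

    urlencode percent-escapes the sitemap URL so it is safe to
    pass as a query-string value.
    """
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://primary-domain.com/sitemapindex.xml"))
```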
Thank you for the information, but I have almost 40 domains on my multisite install; that is why I was hoping to enable robots.txt on all my mapped domains, so it would be done automatically.
So going back to my original question, is it possible to enable robots.txt for mapped domains?
Looks like I have found my answer: I can have a robots.txt for each mapped domain only if I use sub-domains. If I use sub-folders, I am out of luck, since crawlers only read robots.txt from the root of a host, so sub-folder sites share the primary domain’s file. Shoot, this sucks.
http://betterwp.net/wordpress-plugins/google-xml-sitemaps/#toc-robots