So I have successfully enabled multi-site on my blog, set it to use sub-directories, and created two new sites. I had a robots.txt for the main blog and was using the robots meta plugin to manage it.
I network-activated the robots meta plugin, and the result is that the same robots.txt file now appears under the sub-directory (i.e., the sub-site) as well. I am not sure this is the right behavior. I think there should be only one robots.txt, located at main-site/robots.txt, rather than separate files like main-site/sub-site/robots.txt. Am I right? If so, should I go ahead and deactivate the robots meta plugin for the sub-site and remove the extra robots.txt it created?
I network activated the robots meta plugin and it resulted in the creation of the same robots.txt file under the sub-directory (or the site).
You mean it made a subfolder and spawned the robots.txt file?
I dunno, I think you could just handle it via one robots.txt in the main root of your install (same level as wp-config.php basically).
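For illustration, a single physical robots.txt at the install root (the same directory as wp-config.php) covers the main site and every sub-directory site, since crawlers only ever request it from the domain root. This is just a sketch; the paths below are examples, not recommendations:

```
# Example robots.txt placed at the web root of the multisite install.
# One file applies to main-site/ and main-site/sub-site/ alike,
# because crawlers only fetch /robots.txt from the domain root.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Note that if a physical robots.txt exists at the root, WordPress will no longer generate its virtual one for any of the sites.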
Can you link to the plugin you’re using?
Well, I am not sure it actually created another robots.txt file, because when I looked in blogs.dir->sub-directory->files… I couldn’t see any robots.txt. But when I point my browser at main-site/sub-site/robots.txt, it does show up (and it’s actually the same one as the main site’s).
Oh okay 🙂 It’s a virtual file. That’s fine.
It doesn’t do any harm, I don’t think — crawlers only request robots.txt from the root of the domain anyway, so the copy that appears under the sub-directory URL just gets ignored.
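For background on where that virtual file comes from: when no physical robots.txt exists, WordPress answers requests for /robots.txt dynamically via do_robots(), which runs the output through the robots_txt filter. That’s why every sub-directory site serves the same content without any file on disk. A minimal sketch of customizing it (the filter and its signature are real WordPress APIs; the /private/ rule is purely a hypothetical example) might look like:

```php
// Sketch only: append an example rule to WordPress's virtual robots.txt.
// The 'robots_txt' filter receives the generated output and the site's
// public/private setting; returning the string changes what is served.
add_filter( 'robots_txt', function ( $output, $is_public ) {
    if ( $is_public ) {
        // Hypothetical path used for illustration, not a recommendation.
        $output .= "Disallow: /private/\n";
    }
    return $output;
}, 10, 2 );
```

On a multisite network, this filter runs per site, so the virtual file can differ between the main site and sub-sites if you ever need that.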