So I have successfully enabled multisite on my blog, set it up to use sub-directories, and created two new sites. I had a robots.txt for the main blog and was using the Robots Meta plugin to manage it.
I network-activated the Robots Meta plugin, and it resulted in the same robots.txt appearing under the sub-directory (i.e. the sub-site). I'm not sure this is the right thing to do. I think there should be only one robots.txt, located at main-site/robots.txt, rather than separate files like main-site/sub-site/robots.txt… am I right? If so, should I go ahead and deactivate the Robots Meta plugin for the sub-site and remove the extra robots.txt it created?
Well, I'm not sure it actually created another robots.txt file, because when I looked in blogs.dir -> sub-directory -> files… I couldn't see any robots.txt. But when I point my browser to main-site/sub-site/robots.txt, it does show up (and it's actually the same one as the main site's).
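If I understand correctly, WordPress doesn't write a physical file at all: it generates robots.txt on the fly via do_robots(), and plugins change the output through the robots_txt filter, which would explain why the URL responds even though no file exists on disk. A minimal sketch of that hook, assuming a small plugin file (the extra Disallow rule is just a made-up example, not what the Robots Meta plugin actually adds):

<?php
/*
 * Sketch: WordPress serves a "virtual" robots.txt. Requests for /robots.txt
 * are routed to do_robots(), and the generated output can be modified with
 * the 'robots_txt' filter. No robots.txt file is created on disk.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the site's "Search engine visibility" setting.
    // Append a hypothetical rule to the generated output as an illustration.
    $output .= "Disallow: /example-private/\n";
    return $output;
}, 10, 2 );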