So I have successfully enabled multi-site on my blog, forced it to use sub-directories, and created two new sites. I had a robots.txt for the main blog and was using the Robots Meta plugin to manage it.
I network-activated the Robots Meta plugin, and it created the same robots.txt file under the sub-directory (i.e., under the sub-site). I am not sure this is the right thing to do. I think there should be only one robots.txt, located at main-site/robots.txt, rather than separate files like main-site/sub-site/robots.txt. Am I right? If so, should I go ahead and deactivate the Robots Meta plugin for the sub-site and remove the extra robots.txt it created?
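For what it's worth, my understanding is that crawlers only ever request robots.txt at the root of the host, never inside a path, so a file at main-site/sub-site/robots.txt would be ignored anyway. A minimal sketch illustrating this (the example.com URLs are hypothetical, standing in for my main site and sub-site):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    # Per the Robots Exclusion Protocol, crawlers resolve robots.txt
    # against the scheme + host only, discarding any path component.
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# A sub-directory site resolves to the SAME robots.txt as the main site:
print(robots_url("https://example.com/sub-site/some-post/"))  # https://example.com/robots.txt
print(robots_url("https://example.com/"))                     # https://example.com/robots.txt
```

So if that's correct, per-sub-site robots.txt files in a sub-directory install would never even be fetched.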