I have searched the net and have not yet found an answer to this problem.
This may end up being rather technical, so input from experienced PHP/WordPress users would be very much appreciated.
I am aware that it is commonly held on forums that Google should only look at directives from the root robots.txt; however, if that were the case I would not be seeing the issues in Webmaster Tools about restricted URLs.
I have WordPress installed in a folder of http://www.dating-review-uk.co.uk (/blog).
I have already written my own robots.txt file, which is in my root, but I noticed that a "virtual" robots.txt is also generated there.
Initially this was set to "disallow" when I browsed to it directly, even though I had correctly set my privacy settings in the WP admin area. To counter this I installed the KB Robots.txt plugin, configured the robots.txt through that, and all seemed well.
However, Google Webmaster Tools reported that several URLs were restricted, so I looked into it and found that the non-www version of my site was still returning the "disallow" virtual robots.txt for the WP folder.
I have reconfigured my .htaccess to solve the canonical issues (redirect ALL non-www requests to the www. version), but this virtual robots.txt is still being served.
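For reference, the canonical redirect I have in my root .htaccess is roughly this (a standard mod_rewrite 301 sketch; I'm reproducing it from memory, so treat it as approximate):

```apache
RewriteEngine On
# Catch any request whose Host header is the bare (non-www) domain
RewriteCond %{HTTP_HOST} ^dating-review-uk\.co\.uk$ [NC]
# Permanently (301) redirect it to the same path on the www. version
RewriteRule ^(.*)$ http://www.dating-review-uk.co.uk/$1 [R=301,L]
```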
Now I have considered adding another .htaccess to the folder that WP is in to try to redirect that specifically, but I am not convinced that this is the best fix.
So my question is: how do I stop WP from generating a virtual robots.txt in the first place?
Surely there is somewhere in the code where I can direct WordPress to serve my root file instead of making one up on the fly?
Or can I simply stop the virtual file being created/served and have my own version served instead?
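For anyone reading later, this is a sketch of the kind of fix I mean, using WordPress's robots_txt filter (which I believe is the hook core applies before outputting the virtual file; the function name here is just my own, and I haven't tested this):

```php
<?php
// Untested sketch: short-circuit WP's virtual robots.txt by hooking
// the 'robots_txt' filter and returning the contents of the real
// robots.txt in the site root instead of WP's generated rules.
add_filter( 'robots_txt', 'use_root_robots_file', 10, 2 );

function use_root_robots_file( $output, $public ) {
    $root_file = $_SERVER['DOCUMENT_ROOT'] . '/robots.txt';
    if ( is_readable( $root_file ) ) {
        return file_get_contents( $root_file );
    }
    return $output; // fall back to whatever WP generated
}
```

If that hook is right, it would keep the root file as the single source of truth without needing a second .htaccess in the WP folder.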
ANY help on this would be very much appreciated, as I have only been using WordPress for a few weeks and I'm sure that others have seen this issue in the past.