I’m having issues with the virtual robots.txt file that is blocking my site. I’ve searched blogs for solutions but couldn’t find one that works; most of the suggested solutions were for older versions of WordPress. The only thing I’ve changed in Settings is to uncheck the option “Discourage search engines from indexing this site”. But when I search for the domain in Google I get the message “A description for this result is not available because of this site’s robots.txt”, and in Webmaster Tools, when I try to submit the sitemap, I get the error “URL restricted by robots.txt”.
I’ve tried creating a physical robots.txt file to overwrite the virtual one, and this is what comes out: http://happygreenfood.com/robots.txt
What should I do to solve this problem? I don’t need or want a robots.txt file at all.
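In case it matters: from what I’ve read, if I do end up keeping a physical file at the site root to override the virtual one, a fully permissive robots.txt that blocks nothing would be just:

```
User-agent: *
Disallow:
```

My understanding is that an empty `Disallow:` directive means no URL is restricted, so crawlers would be free to index the whole site.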
- The topic ‘How to delete virtual Robots.txt file?’ is closed to new replies.