How to delete virtual Robots.txt file?

  • natasa.mandic


    Hello everybody,
    I’m having issues with the virtual robots.txt file that blocks my site. I’ve searched blogs for solutions but couldn’t find one that works; most of the suggested solutions were for older versions of WP. The only thing I’ve changed in Settings is to uncheck the option “Discourage search engines from indexing this site”. But when I search for the domain in Google I get the message “A description for this result is not available because of this site’s robots.txt”, and in Webmaster Tools, when I try to submit the sitemap, it gives me the error “URL restricted by robots.txt”.
    I’ve tried to create a new robots.txt file to overwrite the virtual one, and this is what comes out

    What should I do to solve this problem? I don’t need and don’t want a robots.txt file.

Viewing 1 replies (of 1 total)
  • Pioneer Web Design


    It works like this: create a robots.txt file and add it to your site root. If a physical one exists, WP will not use the virtual one. Also, let Google crawl your site again (and again and again); you need to give them time to re-index your site. Use Google Webmaster Tools to submit both your robots.txt file and a sitemap. Many SEO plugins can do this as well.
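    A minimal sketch of the step above, assuming shell access to the server and that the site root is the current directory (the domain in the Sitemap line is a placeholder, not from this thread). A robots.txt with an empty Disallow line permits all crawling:

    ```shell
    # Hypothetical example: write a fully permissive robots.txt into the
    # site root so WordPress stops serving its virtual one.
    cat > robots.txt <<'EOF'
    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml
    EOF
    ```

    With no rules disallowed, crawlers can reach every URL, and the physical file takes precedence over the virtual robots.txt that WordPress generates.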

  • The topic ‘How to delete virtual Robots.txt file?’ is closed to new replies.