[Plugin: Invisible Defender] Googlebots are unable to crawl through my site

  • Resolved

    CallUp (@callup)


    Hi,
    I was wondering if this plug-in is blocking legitimate search engine bots. My site is indexed at Google, and I was notified that Google was getting crawl errors on my site. This is what it read:

    Crawl Errors: Restricted by robots.txt (4)
    Detail: URL restricted by robots.txt

    I don’t have a robots.txt file, so my site should be accessible to Googlebots. This is the only plug-in I have that blocks bots of any kind. Could it be that this plugin is blocking Google?

    Thanks for your help.

    http://wordpress.org/extend/plugins/invisible-defender/

  • Invisible Defender does not do anything with robots.txt.

    WordPress generates a virtual robots.txt file by itself, so it is possible that another plugin added some extra entries to it. Please try navigating to yoursite.com/robots.txt in your browser. WordPress generates only the two lines listed below; if you see anything extra, it was added by another plugin (or plugins).

    User-agent: *
    Disallow:
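
    For reference (this is not taken from the plugin's code, just a minimal sketch of the mechanism): WordPress builds that virtual file in do_robots() and passes the text through the 'robots_txt' filter, which is how a plugin can append extra entries. A hypothetical plugin hook might look like this; the function name and the Disallow path are made up for illustration.

    <?php
    // Minimal sketch: appending rules to WordPress's virtual
    // robots.txt via the core 'robots_txt' filter.
    // $output is the robots.txt body core has built so far;
    // $public is '1' when the site is visible to search engines.
    function myplugin_append_robots_rules( $output, $public ) {
        if ( '1' == $public ) {
            // Hypothetical rule: keep crawlers out of one path.
            $output .= "Disallow: /example-private-page/\n";
        }
        return $output;
    }
    add_filter( 'robots_txt', 'myplugin_append_robots_rules', 10, 2 );

    If you do see extra Disallow lines, deactivating plugins one at a time and reloading yoursite.com/robots.txt should reveal which one adds them.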
  • The topic ‘[Plugin: Invisible Defender] Googlebots are unable to crawl through my site’ is closed to new replies.