[resolved] [Plugin: Invisible Defender] Googlebots are unable to crawl through my site (2 posts)

  1. CallUp
    Posted 6 years ago #

    I was wondering whether this plug-in is blocking legitimate search engine bots. My site is indexed by Google, and I was notified that Google was getting crawl errors on it. This is what the report said:

    Crawl Errors: Restricted by robots.txt (4)
    Detail: URL restricted by robots.txt

    I don't have a robots.txt file, so my site should be accessible to Googlebot. This is the only plug-in I have that blocks bots of any kind. Could it be that it is blocking Google?

    Thanks for your help.


  2. Daniel Frużyński (sirzooro)
    Posted 6 years ago #

    Invisible Defender does not do anything with robots.txt.

    WordPress generates a virtual robots.txt file by itself, so it is possible that another plugin added some extra entries to it. Please try navigating to yoursite.com/robots.txt in your browser. WordPress generates only the two lines listed below; if you see anything extra, it was added by another plugin.

    User-agent: *
    Disallow:
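    A quick way to check this is to fetch your site's robots.txt and list any lines beyond the WordPress defaults. The sketch below assumes the two default lines are `User-agent: *` and an empty `Disallow:` (as on a public blog); anything left over came from another plugin, not from Invisible Defender.

    ```python
    # Default lines of WordPress's virtual robots.txt on a public blog
    # (assumption based on the reply above; verify against your own site).
    WP_DEFAULT = ["User-agent: *", "Disallow:"]

    def extra_lines(robots_txt: str) -> list[str]:
        """Return non-blank lines that are not part of the WordPress default."""
        lines = [ln.strip() for ln in robots_txt.splitlines() if ln.strip()]
        return [ln for ln in lines if ln not in WP_DEFAULT]

    # Example: a robots.txt where some plugin has added an extra rule.
    sample = "User-agent: *\nDisallow:\nDisallow: /some-plugin-dir/\n"
    print(extra_lines(sample))  # -> ['Disallow: /some-plugin-dir/']
    ```

    To run it against a live site, fetch yoursite.com/robots.txt (e.g. with `urllib.request.urlopen`) and pass the response body to `extra_lines`; an empty result means the file is the plain WordPress default.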

Topic Closed

This topic has been closed to new replies.
