I was wondering whether this plug-in is blocking legitimate search engine bots. My site is indexed by Google, and I was notified that Google is getting crawl errors on my site. This is what the report said:
Crawl Errors: Restricted by robots.txt (4)
Detail: URL restricted by robots.txt
I don’t have a robots.txt file, so my site should be accessible to Googlebot. This is the only plug-in I have installed that blocks bots of any kind. Could it be blocking Google?
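(For context on the error above: even without a physical robots.txt file, WordPress and some plugins can serve a virtual one, and Google obeys whatever rules it receives. A minimal sketch of how such rules are interpreted, using Python's standard-library `urllib.robotparser` with hypothetical rule content, not this plugin's actual output:)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules that would block all crawlers --
# the kind of output a plugin might generate dynamically even
# when no robots.txt file exists on disk.
blocking_rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(blocking_rules)

# Under these rules, Googlebot is denied every URL.
print(parser.can_fetch("Googlebot", "/some-page/"))  # False

# With no rules at all (equivalent to a missing robots.txt),
# everything is allowed.
open_parser = RobotFileParser()
open_parser.parse([])
print(open_parser.can_fetch("Googlebot", "/some-page/"))  # True
```

(Fetching `http://yoursite.example/robots.txt` in a browser while the plugin is active would show whether it is emitting rules like these.)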
Thanks for your help.
- The topic ‘[Plugin: Invisible Defender] Googlebots are unable to crawl through my site’ is closed to new replies.