[Plugin: Google XML Sitemaps] robots.txt (1 post)

  1. luffer
    Posted 5 years ago #

    I've added this plugin, but it seems to be blocking crawlers without me asking it to. I want search engines to be able to crawl my site, so why has the plugin disallowed them all?

    I recently noticed all my Google AdSense ads had vanished and my visitor numbers had plummeted. I contacted Google, who told me my robots.txt file was blocking them from crawling my site. It contained:

    User-agent: *
    Disallow: /

    Why has the plugin done this, and how do I stop it? Why did it act this way without notifying me or asking permission? And why are there no settings to disable it?
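
    For comparison, what I actually want (and what I believe is the standard allow-all form, with an empty Disallow value) is:

    # Allow all crawlers to access the whole site
    User-agent: *
    Disallow:

    As I understand it, an empty Disallow line permits everything, whereas "Disallow: /" blocks the entire site.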

Topic Closed

This topic has been closed to new replies.
