• I am trying to get Google to correctly crawl and index my site. I uploaded a robots.txt file directly to the site’s root directory with the following contents:
    User-Agent: *
    Disallow:
    This should tell the bot that the entire site is allowed to be crawled. But when I ask Google to test the robots.txt file, it displays a version with a crawl delay and two “Disallow” rules:
    User-Agent: *
    Crawl-delay: 1
    Disallow: /wp-content/plugins/
    Disallow: /wp-admin/

    Why is Google seeing something different from the robots.txt file I uploaded?!
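    For reference, the practical difference between the two versions can be checked with Python’s standard-library robots.txt parser. This is only a sketch to show what each file permits; the paths tested are examples, not the poster’s actual URLs.

    ```python
    from urllib.robotparser import RobotFileParser

    def can_fetch(rules: str, path: str) -> bool:
        """Parse a robots.txt body and ask whether a generic bot may fetch path."""
        parser = RobotFileParser()
        parser.parse(rules.splitlines())
        return parser.can_fetch("*", path)

    # The file as uploaded: an empty Disallow blocks nothing.
    uploaded = """\
    User-Agent: *
    Disallow:
    """

    # The version Google reports seeing.
    reported = """\
    User-Agent: *
    Crawl-delay: 1
    Disallow: /wp-content/plugins/
    Disallow: /wp-admin/
    """

    print(can_fetch(uploaded, "/wp-admin/"))   # True: the uploaded file allows everything
    print(can_fetch(reported, "/wp-admin/"))   # False: /wp-admin/ is disallowed
    print(can_fetch(reported, "/a-post/"))     # True: ordinary pages are still crawlable
    ```

    So the reported version only blocks the two WordPress admin/plugin paths; the rest of the site remains crawlable either way.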

Viewing 1 reply (of 1 total)
  • Do you have an SEO plugin installed? Some try to “help” with the robots.txt file.

    Have you loaded your own robots.txt in a browser to see what is actually being served?

  • The topic ‘Robots.txt file arbitrarily changed’ is closed to new replies.