
File permission changes altering robots.txt

  • I am receiving a severe health warning message from Google Webmaster Tools telling me that Googlebot cannot access our site because of this robots.txt file:

    User-agent: *
    Crawl-delay: 20

    User-agent: 008
    Disallow: /

    I asked my host to change the standard WP file permissions to make our site more secure. Could that change have caused the robots.txt file to read like this? Our site was hacked about 4 months ago.
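
    For reference, here is the same file annotated with comments (robots.txt treats everything after a "#" as a comment). Reading the "008" agent as the 80legs crawler is an assumption based on that bot's usual user-agent string:

    User-agent: *      # applies to every crawler, including Googlebot
    Crawl-delay: 20    # asks crawlers to wait 20 seconds between requests

    User-agent: 008    # assumed: the 80legs crawler, which identifies itself as "008"
    Disallow: /        # blocks that one crawler from the entire site

    Note that nothing in this file blocks Googlebot itself, and Googlebot ignores the Crawl-delay directive, so the warning is more likely about fetching robots.txt than about its contents.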

  • esmi

    @esmi

    Forum Moderator

    Changing permissions would not have affected robots.txt. Have you tried simply removing the file?

    Thanks Esmi. Could the robots.txt file be a result of the hack?

    esmi

    @esmi

    Forum Moderator

    I doubt it, simply because a hacker has nothing to gain from it, especially if that’s all the robots.txt file contains. Your hosts might have changed it, though.
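
    One way to tell whether the host (or anyone else) changed the file is to check its owner and modification time on the server. A minimal sketch, assuming a Unix host with shell access and Python installed; the path below is a placeholder for your actual web root:

    import os
    import pwd
    import time

    # Placeholder: adjust to the real location of robots.txt in your web root.
    path = "/var/www/html/robots.txt"

    st = os.stat(path)
    print("owner:   ", pwd.getpwuid(st.st_uid).pw_name)  # which account owns the file
    print("modified:", time.ctime(st.st_mtime))          # when it was last changed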

    Thanks Esmi.

    Would I simply edit it to read:

    User-agent: *
    Crawl-delay: 20

    User-agent: 008

    Just taking out the “Disallow” line?
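
    One caution: under the original robots exclusion standard, every "User-agent" record is supposed to contain at least one "Disallow" line, so leaving "User-agent: 008" with no rule under it would be malformed. If the aim is to let every crawler index everything, a minimal valid file would be (an empty Disallow value means "allow all"):

    User-agent: *
    Disallow: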

    esmi

    @esmi

    Forum Moderator

    Personally, I’d take out the lot.

    Google state:

    “You need a robots.txt file only if your site includes content that you don’t want search engines to index. If you want search engines to index everything in your site, you don’t need a robots.txt file—not even an empty one. If you don’t have a robots.txt file, your server will return a 404 when Googlebot requests it, and we will continue to crawl your site. No problem.”

    Would I be better off removing the robots.txt?

    Then again, if there is a hack, the file may just keep being regenerated.

    Thanks Esmi.

    esmi

    @esmi

    Forum Moderator

    If you want, you could have a look at the robots.txt example in this Codex section. Or use something like http://wordpress.org/extend/plugins/google-sitemap-generator/ to generate both a Google sitemap and a robots.txt file for you.
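
    For illustration, a file along the lines of the Codex example (a sketch, not the exact Codex text) keeps crawlers out of the WordPress system directories while leaving posts and pages crawlable; the Sitemap line is optional and the URL is a placeholder:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Sitemap: http://mywebsite.com/sitemap.xml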

    Esmi — I use google-sitemap-generator and I’m still getting a similar message from Google.

    Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

    When I do a Fetch as Google, it finds the file just fine (although I don’t find the file on my server anywhere). I assume it is a virtual or dynamic robots.txt. However, the link given for the location of my file doesn’t really make sense: it says http://mywebsite.com//robots.txt

    Anyway, the google-sitemap-generator doesn’t seem to help.
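
    One way to see what crawlers actually receive, including the odd double-slash URL that Webmaster Tools reported, is to request robots.txt directly and compare the responses. A minimal Python sketch; mywebsite.com is the placeholder domain from the post above:

    import urllib.error
    import urllib.request

    # Compare the normal URL with the double-slash variant Webmaster Tools showed.
    for url in ("http://mywebsite.com/robots.txt",
                "http://mywebsite.com//robots.txt"):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                print(url, "->", resp.status, resp.headers.get("Content-Type"))
        except urllib.error.URLError as exc:
            print(url, "-> failed:", exc)

    If both URLs return 200 with the same body, the errors Google reports are more likely intermittent server or DNS hiccups than a problem with the file itself.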

  • The topic ‘File permission changes altering robots.txt’ is closed to new replies.