
File permission changes altering robot.txt (10 posts)

  1. colmad
    Member
    Posted 1 year ago #

    I am receiving a severe health warning message from Google Webmaster Tools telling me that Googlebot cannot access our site because of this robots.txt file:

    User-agent: *
    Crawl-delay: 20

    User-agent: 008
    Disallow: /

    I asked my host to change the standard WP file permissions to make our site more secure, and wondered if this could have caused the robots.txt file to read like this? Our site was hacked about 4 months ago.
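
    As written, those directives ask crawlers matching the wildcard group to wait 20 seconds between requests and block only a crawler identifying itself as "008"; nothing in them disallows Googlebot. A minimal sketch with Python's standard-library robotparser (example.com is just a placeholder) that checks the same rules:

    import urllib.robotparser

    # The directives quoted above, as a list of lines.
    rules = [
        "User-agent: *",
        "Crawl-delay: 20",
        "",
        "User-agent: 008",
        "Disallow: /",
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("Googlebot", "http://example.com/"))  # True - only "008" is disallowed
    print(rp.can_fetch("008", "http://example.com/"))        # False
    print(rp.crawl_delay("Googlebot"))                        # 20, from the wildcard group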

  2. esmi
    Theme Diva & Forum Moderator
    Posted 1 year ago #

    Changing permissions would not have affected robots.txt. Have you tried simply removing the file?

  3. colmad
    Member
    Posted 1 year ago #

    Thanks Esmi, could the robots.txt file be a result of the hack?

  4. esmi
    Theme Diva & Forum Moderator
    Posted 1 year ago #

    I doubt it - simply because the hacker has nothing to gain from it, especially if that's all the robots.txt file contains. Your hosts might have changed it, though.

  5. colmad
    Member
    Posted 1 year ago #

    Thanks Esmi.

    Would I simply edit to read:

    User-agent: *
    Crawl-delay: 20

    User-agent: 008

    Just taking out the "disallow"?
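
    If the goal is simply to let everything crawl, the conventional "allow all" robots.txt is just a wildcard group with an empty Disallow value; the 008 group with its Disallow removed would not do anything in any case. A minimal example:

    User-agent: *
    Disallow:
    # An empty Disallow value means nothing is blocked.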

  6. esmi
    Theme Diva & Forum Moderator
    Posted 1 year ago #

    Personally, I'd take out the lot.

  7. colmad
    Member
    Posted 1 year ago #

    Google states:

    (You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file—not even an empty one. If you don’t have a robots.txt file, your server will return a 404 when Googlebot requests it, and we will continue to crawl your site. No problem.)

    Would I be better off removing the robots.txt?

    Then again, if there is a hack, it may be that the file will keep being regenerated.
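
    One quick way to see what crawlers actually receive after deleting the file is to request it directly and look at the status code; per the quote above, a 404 is harmless. A small sketch in Python (the URL is a placeholder):

    import urllib.error
    import urllib.request

    try:
        resp = urllib.request.urlopen("http://example.com/robots.txt")
        print(resp.status)   # 200 means something (a real file or a generated one) is still served
        print(resp.read().decode("utf-8", "replace"))
    except urllib.error.HTTPError as err:
        print(err.code)      # 404 once the file is really gone - Google then crawls normally

    If a 200 keeps coming back with the old blocking rules after you delete the file, something is regenerating it, which would fit the concern above.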

  8. colmad
    Member
    Posted 1 year ago #

    Thanks Esmi.

  9. esmi
    Theme Diva & Forum Moderator
    Posted 1 year ago #

    If you want, you could have a look at the robots.txt example in this Codex section. Or use something like http://wordpress.org/extend/plugins/google-sitemap-generator/ to generate both a Google sitemap and a robots.txt file for you.

  10. hehafner
    Member
    Posted 1 year ago #

    Esmi -- I use google-sitemap-generator and I'm still getting a similar message from Google.

    Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.

    When I do a fetch as Google, it finds the file just fine... (although I don't find the file on my server anywhere). I assume it is a virtual or dynamic robots.txt... However, the link given for the location of my file doesn't really make sense. It says http://mywebsite.com//robots.txt

    Anyway, the google-sitemap-generator doesn't seem to help.

Topic Closed

This topic has been closed to new replies.
