
Unreachable Robots.txt Error (9 posts)

  1. probookings
    Member
    Posted 5 months ago #

I'm using WordPress on this site...

    http://www.probookings.com

    sitemap is here: http://www.probookings.com/sitemap.xml

    robots is here: http://www.probookings.com/robots.txt

    - I keep getting robots.txt errors when I try to test my sitemap.

Can't figure this out - I've been trying for days.

For a while I was getting "network unreachable" errors too - but I was always able to access the pages fine.

    what's going on here?

  2. Gregg Banse
    Member
    Posted 5 months ago #

    I can reach those files fine. What are you using to test with?

  3. probookings
    Member
    Posted 5 months ago #

Yes, I can read them too. It's Google, from within Webmaster Tools - when I do the sitemap test in there it gives me the error.

  4. Gregg Banse
    Member
    Posted 5 months ago #

    Ah. Have you cleared the error to be sure it's still a current issue?

  5. probookings
    Member
    Posted 5 months ago #

Yeah - there are no errors, all are cleared - yet when I try again I still get an "unreachable" error within Webmaster Tools for my robots.txt file.

According to Google it may be this:

    Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall.

    .....

But my hosting company, GVO, keeps pointing their finger back at Google.

  6. Andrew
    Forum Moderator
    Posted 5 months ago #

    What reason do they provide when pointing at Google?

  7. probookings
    Member
    Posted 5 months ago #

    this is what google says...

    +++++++++++++++++++++++++++++++

    Unreachable: robots.txt

    Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn't crawl any pages that you had roboted out. However, your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file. To make sure we didn't crawl any pages listed in that file, we postponed our crawl.

    Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall.

    ++++++++++++++++++++++++++++++++++++

I've also tried doing fetches of other pages of my site from within Google and get the same errors.

To me it looks like my host has Googlebot blocked for some reason. I can see any page on my site http://www.probookings.com - every time.

    Can't think of anything else.

  8. Gregg Banse
    Member
    Posted 5 months ago #

    What does GVO say? Have you tried using a proxy as Googlebot and seeing if you can get to the robots.txt file?
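One quick way to check that without a proxy (a minimal sketch in Python - the user-agent string is Googlebot's published desktop UA, and the URL is the one from this thread) is to request robots.txt twice, once with an ordinary browser UA and once as Googlebot, and compare the status codes the server returns:

```python
import urllib.error
import urllib.request

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        # A 4xx/5xx response still tells us what the server said.
        return e.code

# Live check against the site in this thread, e.g.:
#   print(fetch_status("http://www.probookings.com/robots.txt", "Mozilla/5.0"))
#   print(fetch_status("http://www.probookings.com/robots.txt", GOOGLEBOT_UA))
```

If the Googlebot request comes back 5xx while the browser one returns 200, the server (or a firewall in front of it) is treating Googlebot differently - which would point at the host. If both return 200, the block may instead be keyed on Googlebot's IP ranges rather than its user agent, which a test from your own machine can't reproduce.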

  9. priya.jain21
    Member
    Posted 2 weeks ago #

Hi,
I have a similar issue with my site.
Please help with a definite solution.
