Forum Replies Created

Viewing 3 replies - 1 through 3 (of 3 total)
  • Thread Starter probookings

    (@probookings)

    this is what google says…

    +++++++++++++++++++++++++++++++

    Unreachable: robots.txt

    Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn’t crawl any pages that you had roboted out. However, your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file. To make sure we didn’t crawl any pages listed in that file, we postponed our crawl.

    Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall.

    ++++++++++++++++++++++++++++++++++++

    I’ve also tried doing fetches of other pages of my site from within google and get the same errors.

    To me it looks like my host has Googlebot blocked for some reason. I can view any page on my site http://www.probookings.com myself – every time.

    Can’t think of anything else.
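
    One quick way to narrow this down from your own machine is to fetch robots.txt twice – once with a normal browser user agent and once with Googlebot's – and compare the status codes. This is only a rough sketch: a host or firewall can also block Googlebot by IP range rather than by user agent, and this test won't catch that. The URL and helper names here are just illustrative.

    ```python
    import urllib.request
    import urllib.error

    # Googlebot's published desktop user-agent string
    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")
    BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

    def fetch_status(url, user_agent):
        """Return the HTTP status code seen when fetching url with the given UA."""
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code  # server answered, but with an error status

    def diagnose(status):
        """Map a status code to a rough verdict for this robots.txt check."""
        if 500 <= status <= 599:
            return "5xx: server/firewall problem (matches the Webmaster Tools error)"
        if status in (401, 403):
            return "blocked: host may be filtering this user agent"
        return "reachable"

    # Example run (needs network access; site URL is from the thread):
    # for ua in (BROWSER_UA, GOOGLEBOT_UA):
    #     status = fetch_status("http://www.probookings.com/robots.txt", ua)
    #     print(ua.split("(")[0].strip(), "->", status, diagnose(status))
    ```

    If the browser UA gets a 200 but the Googlebot UA gets a 5xx or 403, that points at the host's firewall; if both succeed, the block is more likely IP-based and only the host's server logs would show it.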

    Thread Starter probookings

    (@probookings)

    yeah – there are no errors – all are cleared – yet when I try again I still get an “unreachable error” within webmaster tools with my robots.txt file.

    According to google it may be this….

    Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall.

    …..

    but my hosting company GVO keeps pointing their finger back at Google.

    Thread Starter probookings

    (@probookings)

    yes, I can read them too – it’s Google, from within Webmaster Tools. When I do the sitemap test in there it gives me the error.
