Googlebot is blocked from Website (4 posts)

  1. menton
    Posted 3 years ago #

We have just launched a website (www.gocar.ie). For the first day, the box asking search engines not to index the site was ticked; I have since changed it to allow search engines to index the site.

However, I am getting an issue with the robots.txt file in Google Webmaster Tools, and no matter what I try, I get this message: "The page could not be crawled because it is blocked by robots.txt"

My robots.txt is as follows, but I don't believe this to be the issue:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

Anybody got any ideas? Google Webmaster Tools is saying Googlebot is blocked from the site, and the sitemap has 800 issues (www.gocar.ie/sitemap.xml) despite the fact that there are only 32 pages in sitemap.xml.

    All help is appreciated - can't get my head around this issue.
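For what it's worth, those robots.txt rules check out locally. Here is a quick sketch using Python's standard-library parser, assuming the two Disallow lines quoted above are the complete file:

```python
# Sanity-check the quoted robots.txt rules with Python's built-in parser.
# This only tests the rules themselves; it says nothing about what copy
# of robots.txt Google has cached for the live site.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot should be allowed on ordinary pages and the sitemap...
print(rp.can_fetch("Googlebot", "http://www.gocar.ie/"))            # True
print(rp.can_fetch("Googlebot", "http://www.gocar.ie/sitemap.xml")) # True
# ...and blocked only from the admin/includes paths.
print(rp.can_fetch("Googlebot", "http://www.gocar.ie/wp-admin/"))   # False
```

If this prints True for the homepage, the block Webmaster Tools reports is more likely a stale cached copy of robots.txt (or a server-level block) than the rules themselves.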

  2. cjc1867
    Posted 3 years ago #

You need to fix the sitemap, but the robots.txt status can take some time to change: Googlebot needs to revisit your website, re-check the robots.txt file, and update the status in your Google Webmaster Tools.

    How have you created your sitemap?

  3. menton
    Posted 3 years ago #

I created http://www.gocar.ie/sitemap.xml using the Google Sitemap Generator plugin.

  4. cjc1867
    Posted 3 years ago #

You can check whether the sitemap validates. I have just run it through two validators, and they say it is OK. Again, you will have to keep checking back, as you will have to wait for Googlebot to re-crawl and pick up the changes.

    Maybe delete it and resubmit again just in case.
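As a quick local check of what the generated file actually contains (32 URLs versus the 800 issues Webmaster Tools reports), you can count the &lt;url&gt; entries yourself. A minimal sketch with Python's standard library, using a stand-in two-entry sitemap in place of the real file at http://www.gocar.ie/sitemap.xml, which you would need to download first:

```python
# Count the <url> entries in a sitemap. The XML below is a stand-in
# example; for the real check, load the contents of
# http://www.gocar.ie/sitemap.xml into sitemap_xml instead.
import xml.etree.ElementTree as ET

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.gocar.ie/</loc></url>
  <url><loc>http://www.gocar.ie/about/</loc></url>
</urlset>
"""

# Sitemaps use a default namespace, so findall needs a prefix mapping.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = root.findall("sm:url", ns)
print(len(urls))  # 2
```

If the count matches your 32 pages and the file parses cleanly, the 800 "issues" are more likely Google working from a stale fetch, which deleting and resubmitting should clear.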

Topic Closed

This topic has been closed to new replies.