We have just launched a website (www.gocar.ie). For the first day, the box asking search engines not to index the site was ticked; I have since changed it to allow search engines to index the site.
However, I am having an issue with the robots.txt file in Google Webmaster Tools. No matter what I try, I get this message: "The page could not be crawled because it is blocked by robots.txt"
My robots.txt is as follows, but I don't believe this to be the issue:
Anybody got any ideas? Google Webmaster Tools is saying Googlebot is blocked from the site, and it reports 800 issues for the sitemap (www.gocar.ie/sitemap.xml), despite the fact that there are only 32 pages in sitemap.xml.
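In case it helps with diagnosis: one way I've been sanity-checking robots.txt rules locally is Python's built-in `urllib.robotparser`. The rules below are generic examples (not my actual file), just to show the difference between an empty `Disallow:` (allows everything) and `Disallow: /` (blocks the whole site):

```python
# Sanity-check robots.txt rules locally with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

def parser_for(rules: str) -> RobotFileParser:
    """Build a RobotFileParser from a robots.txt string."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp

# Empty Disallow value means "allow everything" for this user-agent.
allow_all = parser_for("User-agent: *\nDisallow:\n")
# "Disallow: /" blocks the entire site.
block_all = parser_for("User-agent: *\nDisallow: /\n")

print(allow_all.can_fetch("Googlebot", "http://www.gocar.ie/"))  # True
print(block_all.can_fetch("Googlebot", "http://www.gocar.ie/"))  # False
```

Running my live file through this says Googlebot is allowed, which is why I don't think robots.txt itself is the problem.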
All help is appreciated - I can't get my head around this issue.