On one of my sites Google sent me a message saying:
We noticed that the AdSense ad crawler is having some issues accessing your site on freegovernmentcellphones.net. The issue appears to lie within your robots.txt file, which is blocking our ad crawler from viewing certain sections of your site. Over a four day period earlier this month, we detected 156 failed crawl requests. Because of this, your AdSense ads are less targeted and are generating less revenue on average.
To fix this, you’ll need to edit your robots.txt file to allow our AdSense crawler by adding these two lines to the very top:
But as you can see from my robots.txt file below, I have "User-agent: *" at the top, which I thought lets in all user agents. Why would I have to add the one they suggest?
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /newsfeeds
Disallow: */newsfeeds
Disallow: /forum
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads
Allow: /wp-content/plugins/gd-star-rating/
Sitemap: http://www.mysite.com/sitemap.xml
By the way, the "Disallow: /*?*" and "Disallow: /*?" rules are there to keep all the junk query-string URLs out of the index, like the ones that end in "?replytocom=649".
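For what it's worth, "User-agent: *" doesn't let everyone in; it applies its Disallow rules to every crawler that doesn't have a more specific group of its own, and a crawler follows only the single most specific group that matches it. So the ad crawler is currently bound by all those Disallow lines, and a dedicated Mediapartners-Google group with an empty Disallow would exempt it. A minimal sketch with Python's stdlib urllib.robotparser, using trimmed-down rules and a made-up URL (note this parser doesn't understand the /*?* wildcard patterns, so only the plain prefix rules are shown):

```python
from urllib import robotparser

# Only the catch-all group, as in the current robots.txt (trimmed).
RULES_STAR_ONLY = """\
User-agent: *
Disallow: /forum
"""

# Same rules plus a dedicated group for the AdSense crawler.
# An empty Disallow means "nothing is disallowed" for that group.
RULES_WITH_ADSENSE = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /forum
"""

def can_fetch(rules, agent, url):
    # Parse an in-memory robots.txt and ask whether `agent` may fetch `url`.
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

# With only the * group, the ad crawler inherits every Disallow:
print(can_fetch(RULES_STAR_ONLY, "Mediapartners-Google",
                "http://www.mysite.com/forum/topic"))   # False

# With its own group, the ad crawler ignores the * group entirely:
print(can_fetch(RULES_WITH_ADSENSE, "Mediapartners-Google",
                "http://www.mysite.com/forum/topic"))   # True
```

Other crawlers still fall through to the * group and stay blocked from /forum, so adding the group Google asks for shouldn't loosen anything else.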