It is happening on my Multisite install of WP 3.4.1. The robots.txt file itself looks fine:
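(I can't paste the exact contents here, but the standard WordPress-generated robots.txt in the "allow" state is typically just this, and mine matches:)

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```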
The file contents change when I switch between the "allow" and "do not allow" options. The "do not allow" contents look like this:
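(Again paraphrasing from memory rather than pasting verbatim, the "do not allow" state is the usual block-everything rule:)

```
User-agent: *
Disallow: /
```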
When I switch it back, it looks fine again. I am using the XML Sitemap generator in the Yoast WordPress SEO plugin, and those sitemaps are formed correctly.
I have also turned the Yoast WordPress SEO plugin off; the robots.txt contents still look fine (no change from above), but Google still says that robots.txt is preventing it from crawling the site.
I also tried network activation of Yoast SEO versus activating it on individual sites. All of the resulting robots.txt files look identical, and Google continues to say it is being prevented from crawling the site by the robots.txt rules.
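In case it helps anyone reproduce the check, here is a quick way to see how a crawler would interpret the two states, using Python's standard `urllib.robotparser`. The rule sets below are the typical WordPress outputs described above, not my exact files:

```python
from urllib.robotparser import RobotFileParser

def allowed(rules, url, agent="Googlebot"):
    """Parse a robots.txt body and ask whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

# Typical "allow" state: only the admin areas are disallowed.
allow_rules = "User-agent: *\nDisallow: /wp-admin/\nDisallow: /wp-includes/\n"

# Typical "do not allow" state: everything is disallowed.
block_rules = "User-agent: *\nDisallow: /\n"

print(allowed(allow_rules, "http://example.com/"))  # True
print(allowed(block_rules, "http://example.com/"))  # False
```

If the live robots.txt that Google fetches parses as the first set, crawling should not be blocked, which is why the continuing error message is so puzzling.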
I am using the Domain Mapping plugin, version 0.5.4.
I also switched the theme to Twenty Eleven, with no change in either the contents of robots.txt or Google's response.