OK, OK, I know this has been asked a million times, but I must be missing something.
I have a WordPress site, and when I check Google Webmaster Tools it says "Severe health issues are found on your site". When I select the site, go to Crawl, and then the robots.txt Tester, it reports "Latest version seen on 7/13/14, 1:58 PM OK (200) 26 bytes" and shows the contents of my robots.txt file as:
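(I'm reconstructing this from what the tester displayed; 26 bytes matches WordPress's standard block-everything output exactly, so it was this:)

    User-agent: *
    Disallow: /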
This is not true.
First, I use XML Sitemap Generator for WordPress 4.0.7 as my sitemap plugin, and its "Add sitemap URL to the virtual robots.txt file" option is NOT checked. I have my own robots.txt file in my home directory (the site's web root) that contains the following:
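(Paraphrasing my actual file here; example.com stands in for my real domain, and the sitemap URL is the one the plugin generated:)

    User-agent: *
    Disallow: /wp-admin/
    Sitemap: http://example.com/sitemap.xml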
So, what gives? I've submitted the plugin-generated sitemap to Google, so why, several days later, does it still think my site has a robots.txt file that blocks search engines from indexing it? I thought that if I had a physical robots.txt file, the virtual robots.txt file that WordPress creates would be ignored. Where do I locate the code for the virtual robots.txt file?
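From poking around, my understanding is that the virtual file is produced by do_robots() in wp-includes/functions.php, and that it is only served when no physical robots.txt exists at the web root. Simplified from memory (the exact lines vary by WordPress version), it works roughly like this:

    function do_robots() {
        header( 'Content-Type: text/plain; charset=utf-8' );

        do_action( 'do_robots' );

        $output = "User-agent: *\n";
        $public = get_option( 'blog_public' );
        if ( '0' == $public ) {
            // Settings -> Reading -> "Discourage search engines" is
            // checked: block everything.
            $output .= "Disallow: /\n";
        } else {
            $site_url = parse_url( site_url() );
            $path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';
            $output  .= "Disallow: $path/wp-admin/\n";
        }

        // Plugins (like the sitemap generator) hook this filter
        // to append their own lines, e.g. a Sitemap: entry.
        echo apply_filters( 'robots_txt', $output, $public );
    }

If that's right, the 26-byte blocked version Google shows should only ever be served when the "Discourage search engines" option is on AND no physical robots.txt exists, which is exactly why I'm confused.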