• I’m using WP version 2.7.1 and I’m working on the SEO of my website.

    I just finished using XML Sitemap Generator for WordPress 3.1.2 (the Google XML Sitemaps plugin – which, by the way, only lists compatibility up to WP version 2.7). The sitemap was created and is working fine! (see it here.)

    Now I’m working on a robots.txt file for web bots. However, I seem to be reading contradictory information on the wordpress.org site. There is a great deal of discussion about creating a robots.txt file, yet there is also some chatter about WP automatically generating a virtual robots.txt file.

    So…

    Do I create an actual robots.txt file, or do I somehow modify the virtual robots.txt file?
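
    (For context, the virtual robots.txt appears to be served only when no physical robots.txt file exists in the site root – and, if I understand the rewrite rules correctly, only when requests are being routed through WordPress. If a real file is uploaded to the root, the web server returns that file and the virtual one is bypassed. What the virtual version outputs seems to depend on the Settings → Privacy choice, roughly like this:)

        # Privacy set to “visible to everyone, including search engines”
        User-agent: *
        Disallow:

        # Privacy set to block search engines
        User-agent: *
        Disallow: /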

  • I second this question – anyone know the answer?

    I am dealing with this too. My thought is to get a robots.txt plugin that allows modifications to the virtual robots.txt. I am also having a problem with Google Webmaster Tools reporting a 404 Not Found for robots.txt, even though the virtual URL is working.

    Contradicting information is all I’m finding as well. Currently my site isn’t being indexed by Google; I receive the following message from Google Webmaster Tools:

    “The page could not be crawled because it is blocked by robots.txt.”

    I’ve read that a virtual robots.txt is generated by WordPress, but when I check my site.com/robots.txt it shows “Disallow: /”.

    So far I have made sure my blog visibility under the Privacy Settings is set to “visible to everyone, including search engines”.

    No changes.

    I made a new robots.txt with “User-agent: *” and “Allow: /”, and uploaded it to my root (see the sketch at the end of this post).

    No changes.

    What other options am I left with? Should I deactivate XML Sitemap Generator for WordPress 3.1.2?
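
    (A side note on the file itself: in the original robots.txt convention, “allow everything” is written as an empty Disallow line – “Allow” is a later extension that not every crawler honors – and each directive has to sit on its own line. A minimal permissive file, with example.com standing in for the real domain, might look like this:)

        # allow all crawlers to index the whole site
        User-agent: *
        Disallow:

        # optional: point crawlers at the XML sitemap
        Sitemap: http://example.com/sitemap.xml

    (I believe the Google XML Sitemaps plugin can also add the Sitemap line to the virtual robots.txt for you, so deactivating it shouldn’t be necessary just for this.)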

  • The topic ‘SEO and the robots.txt file’ is closed to new replies.