Support » Fixing WordPress » Robots.txt – Why Does WordPress Say To Disallow All PHP Pages

  • ademartin



    Just getting to grips with a duplicate content issue on my site. To be brief, I’m using a combination of the All In One SEO plugin and a robots.txt file.

    I can’t understand why the suggested robots.txt file for WordPress (posted on the WP support forum, I think) disallows all your PHP pages.

    I am sure I am not the only one who has php pages on their site that they want to have indexed.

    1. Why disallow the WP PHP pages?
    2. If it is a good thing to do, how do I disallow those pages but still allow the ones I want Google to find?

    Thanks in advance,


Viewing 1 reply (of 1 total)
  • omgitztrey


    Well, some files just weren’t meant to be seen. I keep the WP core files and some processing files (like .dll) out of public view for potential security reasons. They don’t affect your rankings or anything. Here’s my robots.txt.

    The line User-agent: * addresses all crawlers.
    Then you Disallow or Allow whatever you do or don’t want indexed.

    User-agent: *
    # disallow all files in these directories
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/
    # re-allow the images directory (Allow is an extension honored by Google and Bing)
    Allow: /images/

    # disallow files matching these patterns
    Disallow: /*.php$
    Disallow: /*.php*
    Disallow: /wp-*
    Disallow: /*.dll
    Disallow: /*.ini
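
    On the second question from the original post: crawlers that support the Allow directive (Google and Bing do) apply the most specific (longest) matching rule, so you can block PHP files in general and then re-allow the individual pages you want indexed. The paths below (contact.php, /gallery/) are only placeholders for your own pages, not part of WordPress:

    User-agent: *
    # block PHP files in general (the $ anchors the match to the end of the URL)
    Disallow: /*.php$
    # re-allow specific PHP pages you want Google to find;
    # these longer rules win over the shorter Disallow above
    Allow: /contact.php
    Allow: /gallery/*.php$

    Note that robots.txt only controls crawling, not security; anything genuinely sensitive should be protected on the server, not just hidden from bots.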

  • The topic ‘Robots.txt – Why Does WordPress Say To Disallow All PHP Pages’ is closed to new replies.