Robots.txt - Why Does WordPress Say To Disallow All PHP Pages (2 posts)

  1. ademartin
    Member
    Posted 7 years ago

    Hi,

    Just getting to grips with a duplicate content issue on my site. In brief, I'm tackling it with a combination of the All In One SEO plugin and a robots.txt file.

    I can't understand why the suggested robots.txt file for WordPress (posted on the WP support forum, I think) disallows all your PHP pages.

    I am sure I am not the only one who has PHP pages on their site that they want indexed.

    So:
    1. Why disallow the WP PHP pages?
    2. Assuming it is a good thing to do, how do I disallow those pages but allow the ones I want Google to find?

    Thanks in advance,

    Ade

  2. omgitztrey
    Member
    Posted 7 years ago

    Well, some files just weren't meant to be seen. I keep the WordPress core files and some processing files like .dll out of public view for potential security reasons. They don't affect your rankings or anything. Here's my robots.txt.

    User-agent: * addresses all crawlers; after that, you Disallow or Allow whatever you want crawled.

    User-agent: *
    # disallow all files in these directories
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/
    Allow: /images/

    # disallow files matching these patterns
    # ($ anchors the match to the end of the URL, so /*.php$ only
    # hits URLs ending in .php; /*.php* also catches .php URLs with
    # query strings, which makes the $ rule redundant here)
    Disallow: /*.php$
    Disallow: /*.php*
    # anything whose path starts with wp-
    Disallow: /wp-*
    Disallow: /*.dll
    Disallow: /*.ini

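    As for your second question: if you want to block most PHP files but still let a specific page through, you can pair an Allow rule with the Disallow. Keep in mind that Allow and the * / $ wildcards are extensions honored by Google and Bing, not part of the original robots.txt standard, and Google applies the most specific (longest) matching rule regardless of order. A minimal sketch, using a hypothetical contact.php as the page you want crawled:

    User-agent: *
    # let this one PHP page be crawled (hypothetical example file)
    Allow: /contact.php
    # block every other URL ending in .php
    Disallow: /*.php$

    Because /contact.php is a longer, more specific match than /*.php$, Google applies the Allow to that page and the Disallow to the rest.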
Topic Closed

This topic has been closed to new replies.
