Just getting to grips with a duplicate content issue on my site. To keep it brief, I am using a combination of the All In One SEO plugin and a robots.txt file.
I can’t understand why the suggested robots.txt file for WordPress (posted on the WP support forum, I think) disallows all your PHP pages.
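If I remember right, the part I am talking about looks something like this (I am quoting from memory, so the exact pattern may not be word for word):

User-agent: *
# Blocks any URL ending in .php from being crawled
Disallow: /*.php$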
I am sure I am not the only one who has PHP pages on their site that they want to have indexed.
1. Why disallow the WP PHP pages?
2. Once I understand why it is a good thing to do, how do I disallow those pages but allow the ones I want Google to find? (Something like the sketch below is what I have in mind.)
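For example, would something along these lines be valid, assuming Google honours wildcards and Allow rules, and assuming contact.php stands in for a real page of mine that I want indexed?

User-agent: *
# Block the WordPress core PHP files from being crawled
Disallow: /*.php$
# But let Google fetch this one page I do want indexed
Allow: /contact.php$

My understanding is that Google prefers the most specific (longest) matching rule, so the Allow line should win for that one page, but I would welcome confirmation.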
Thanks in advance,