Excluding Categories /page/ in Robots.txt (7 posts)

  1. travelrants
    Posted 8 years ago #


    I would like to block the search engines from indexing the /page/ URLs within the category, i.e.



    I do not want to exclude


    Any ideas how to do this in the robots.txt file, or any other way of doing it within the code?
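    A sketch of the kind of rule being asked about, using hypothetical category paths (the poster's actual URLs were not shown). The classic robots.txt standard does prefix matching only, so the wildcard-free form is used here; Python's standard-library parser can sanity-check it:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical rule: block paginated category pages, but not the
    # first category page. "/category/travel/" is illustrative only.
    rp = RobotFileParser()
    rp.parse("""
    User-agent: *
    Disallow: /category/travel/page/
    """.splitlines())

    # The first category page stays crawlable:
    print(rp.can_fetch("*", "http://example.com/category/travel/"))        # True
    # Paginated pages /page/2/, /page/3/, ... are excluded:
    print(rp.can_fetch("*", "http://example.com/category/travel/page/2/")) # False
    ```

    Major crawlers such as Googlebot also honor wildcard patterns (e.g. `Disallow: /*/page/` to cover every category at once), but that is an extension beyond the original standard, and Python's parser treats Disallow values as plain prefixes.
    
    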


  2. moshu
    Posted 8 years ago #

    Maybe if you can explain what you hope to achieve by that... somebody will have an idea.

  3. travelrants
    Posted 8 years ago #

    Thanks for replying Moshu.

    Each of these pages has the same title and description, and I understand this could cause duplicate content issues with the search engines.

    Is it worth nofollowing all of the category pages?

    It's just that the first category page generates visitors, and I wouldn't want to affect that traffic, but then I don't want to be penalised for duplicate content either.

    Does this help?

  4. whooami
    Posted 8 years ago #

    You will not be penalized for duplicate content. I use the same setup, as do thousands of other people, and my blog is PR6 in places. Just let WP do its thing; you will be fine.

  5. travelrants
    Posted 8 years ago #


    Anyone else have an opinion?

  6. whooami
    Posted 8 years ago #

    They do, and it's the same as mine. The only people here who worry about duplicate content are the people who don't have any experience with WP and come here asking questions just like yours.


    It's really that simple.

  7. whooami
    Posted 8 years ago #

    I'll even go one better (I thought of these things while I was outside smoking):

    1. Google's own blog:


    If you look at it, they have single-post pages, and archive pages with multiple posts within those.

    Their robots.txt:


    User-agent: Mediapartners-Google
    User-agent: *
    Disallow: /search
    Sitemap: http://googleblog.blogspot.com/feeds/posts/default?orderby=updated
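    Read with Python's standard-library parser, those quoted lines block only the search pages and leave everything else, archives included, crawlable. This is a local sanity check of the rules, not a claim about how Googlebot itself evaluates them, and the archive URL below is illustrative:

    ```python
    from urllib.robotparser import RobotFileParser

    # The rules quoted above, parsed locally instead of fetched.
    rp = RobotFileParser()
    rp.parse("""
    User-agent: Mediapartners-Google
    User-agent: *
    Disallow: /search
    Sitemap: http://googleblog.blogspot.com/feeds/posts/default?orderby=updated
    """.splitlines())

    # Search pages are disallowed...
    print(rp.can_fetch("*", "http://googleblog.blogspot.com/search/label/Google"))
    # ...but multi-post archive pages are not (illustrative archive URL):
    print(rp.can_fetch("*", "http://googleblog.blogspot.com/2008_05_01_archive.html"))
    ```
    
    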

    2. Googlebot et al. are 'smarter' than anyone gives them credit for. The people who put together those algorithms understand how blogs work.

    3. Unless you plan on putting the exact same thing on an archive page, i.e. displaying one post AND its comments there, you are not duplicating content (again, see #2; they're smarter than that).

    4. You (generally speaking) are limiting your user's experience on your web site.

    Let's suppose I search for widgets, and a site comes up that is blocking this and blocking that in order to "not get penalized for duplicate content" (ignoring the fact that it won't happen). I click the link off Google -- I go to a single post page. That's great -- I got my widget info. I leave.

    Now let's say you search for widgets, and my site comes up. I'm not an SEO freak; I just let WP and the searchbots do their dance. You click the link, and you may or may not be taken to a single post page. If you aren't -- well, guess what, I have other posts on washers and nuts and bolts there. Maybe you want some of those before you leave?

    In other words, you lost traffic because the other great content you have wasn't there for them to see. Me, I got the traffic.

Topic Closed

This topic has been closed to new replies.
