Support » Requests and Feedback » Permalinks, Duplicate Content and Google.

  • Hi, I run a Cambodian cooking and recipe blog on a triplek2 theme and have recently updated my permalink structure. I moved from the /?p=** style to /year/month/day/post-title.

    Almost immediately Google Webmaster Tools started recognizing both sets of URLs and flagging the duplication in Diagnostics under “duplicate title tags” and “duplicate meta descriptions.”

    I’ve now copped a 65-place penalty that takes me well off the front page.

    My questions are:

    Is there some way to stop Google from seeing/recognizing the old URLs and punishing me for duplicating them?

    And where might the bot be seeing them?

    It was my impression from the documentation that I didn’t need 301 redirects. Do I in fact need them?

    Please help soon.

Viewing 8 replies - 1 through 8 (of 8 total)
  • You need to give each page a different meta description (in the header). The titles and descriptions should each be at least partially unique.

    <meta name="description" content="hot curry whatever"/>

    Check the WP plugin directory for meta, header, or SEO-type plugins.

    I don’t think that’s quite what I mean.

    Google appears to be suggesting that the meta descriptions are duplicated between (1) the old URLs and (2) the new URLs. And it’s across the board: the meta on every post is considered a duplicate of its older (differently URL-ed) self.

    I’ve hand written (a whole lot of) 301 redirects into my .htaccess file.

    Can anyone advise whether this was the right thing to do?

    Will it fix the problem?
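For reference, here is a hedged sketch of what one such hand-written rule might look like (the post ID and target permalink are made up, and note that a plain Redirect directive cannot match query strings, so the /?p= style needs mod_rewrite):

```apache
# Hypothetical example: map one old query-string URL to its new permalink.
<IfModule mod_rewrite.c>
RewriteEngine On
# Old URL: /?p=123  ->  new URL: /2007/06/15/sample-post/
RewriteCond %{QUERY_STRING} ^p=123$
RewriteRule ^$ /2007/06/15/sample-post/? [R=301,L]
</IfModule>
```

The trailing “?” on the target strips the old query string from the redirect. WordPress normally issues these canonical redirects itself after a permalink change, so a long list of hand-written rules may turn out to be redundant.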

    All your pages (front, single, pages, archives, categories) have two meta descriptions: one from the default K2 theme and one from the SEO pack plugin. You should remove the default meta description from the K2 theme.

    Second, since you have a sitemap, you should update your robots.txt and wait for the next crawl. You can also validate the robots.txt rules and sitemap via Google Webmaster Tools.

    The robots.txt below will stop all query-string (“?p=”-style) URLs from being crawled by Googlebot.

    example robots.txt

    User-agent: Googlebot
    Disallow: /*/trackback
    Disallow: /*/feed
    Disallow: /*/comments
    Disallow: /*?*
    Disallow: /*?
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-contents/plugins/
    Disallow: /wp-contents/themes/
    Disallow: /trackback
    Disallow: /comments
    Disallow: /feed
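One side note on those Disallow patterns: the * and $ wildcards are a Google extension to robots.txt, not part of the original standard, so not every crawler honours them. As a rough approximation of Googlebot’s matching (my own sketch, not Google’s actual implementation), translating each pattern into a regex shows that the old query-string URLs are blocked while the new date-based permalinks are not:

```python
import re

def googlebot_blocked(path, patterns):
    """Approximate Google-style robots.txt matching: '*' matches any
    run of characters and '$' anchors the end of the URL path."""
    for pat in patterns:
        regex = re.escape(pat).replace(r'\*', '.*').replace(r'\$', '$')
        if re.match(regex, path):
            return True
    return False

disallow = ['/*/trackback', '/*/feed', '/*/comments', '/*?*', '/*?']

print(googlebot_blocked('/?p=42', disallow))                  # True  (old-style URL)
print(googlebot_blocked('/2008/01/05/fish-amok/', disallow))  # False (new permalink)
```

The Webmaster Tools robots.txt tester gives the authoritative answer for how Googlebot reads your actual file.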

    Thank you very much chaoskaizer, that looks like exactly what I need. The problem appears to be arising because the bot is still finding the ?p= pages for some reason, and this will just stop it.


    I would also advise you to register your site in Google’s Webmaster Tools. This will allow you to see what Google has indexed and remove anything you don’t want. It will also allow you to make sure your sitemap and robots.txt are in order and that Google sees them properly.

    I was just trying to figure this out. Thanks so much for the suggestion chaoskaizer… that rocks. requesting re-inclusion now…


    This was extremely helpful. I even added a new rule:
    Disallow: /page
    to exclude the previous/next pagination pages in the WordPress blog. This got rid of all the duplicate title tags for new posts.

    However, for the old posts, Google Webmaster Tools still says I’ve got duplicate title tags.

    Does this disappear after a full re-crawl of my site, or do I have to remove the ?-links manually in Google Webmaster Tools?

    Shouldn’t this
    Disallow: /wp-contents/plugins/
    Disallow: /wp-contents/themes/

    be without the s at the end, and look like this:

    Disallow: /wp-content/plugins/
    Disallow: /wp-content/themes/

  • The topic ‘Permalinks, Duplicate Content and Google.’ is closed to new replies.