You need to give each page its own meta description (in the document head); the titles and descriptions should be at least partially unique. E.g.:
<meta name="description" content="hot curry whatever"/>
Have a look through the WP plugins for meta, header, or SEO-type plugins.
I don’t think that’s quite what I mean.
Google appears to be suggesting that the meta descriptions are duplicated between (1) the old URLs and (2) the new URLs, and it's across the board: the meta on every post is considered a duplicate of its older (differently URL-ed) self.
I’ve hand written (a whole lot of) 301 redirects into my .htaccess file.
Can anyone advise whether this was the right thing to do?
Will it fix the problem?
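For the record, the rules I hand-wrote look roughly like this (the slugs here are hypothetical, not my real ones):

```apache
# One-off redirects, old URL -> new URL
Redirect 301 /index.php/old-post-slug /2008/01/old-post-slug/

# Or, with mod_rewrite enabled, a pattern covering a whole directory:
RewriteEngine On
RewriteRule ^archives/(.*)$ /2008/$1 [R=301,L]
```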
All your pages (front, single, pages, archives, categories) have 2 meta descriptions. That comes from the default K2 theme plus the SEO pack plugin; you should remove the default meta description from the K2 theme.
Second, you have a sitemap, so I think you should update your robots.txt and wait for the next crawl. You can also validate the robots.txt rules and sitemap via Google Webmaster Tools.
The robots.txt below will stop all "?page="-type URLs from being crawled by Googlebot.
Example robots.txt:
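If you're not sure where K2 outputs it, look in the theme's header.php for a hard-coded line along these lines (a hypothetical sketch — your theme's exact markup may differ) and delete or comment it out, so only the SEO plugin's tag remains:

```html
<!-- theme's hard-coded description: remove this so the SEO plugin's
     per-post <meta name="description"> is the only one in the head -->
<meta name="description" content="<?php bloginfo('description'); ?>" />
```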
Sitemap: http://tamarindtrees.net/sitemap.xml
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /*?*
Disallow: /*?
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-contents/plugins/
Disallow: /wp-contents/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
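You can also sanity-check the plain (non-wildcard) rules locally with Python's standard library. Note that urllib.robotparser does simple prefix matching and does not implement Google's "*" wildcard extensions, so the "/*?" style rules can't be verified this way — use Google Webmaster Tools for those:

```python
# Sanity-check the prefix-based robots.txt rules with the stdlib parser.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Blocked: path falls under a Disallow prefix.
print(rp.can_fetch("SomeBot", "http://tamarindtrees.net/wp-admin/options.php"))
# Allowed: no rule matches an ordinary post URL.
print(rp.can_fetch("SomeBot", "http://tamarindtrees.net/2008/01/some-post/"))
```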
Thank you very much chaoskaizer, that looks like exactly what I need. The problem appears to be arising because the bot is still finding the *?= pages for some reason, and this will just stop it.
Thank you!!
I would also advise you to register your site in Google's Webmaster Tools. This will let you see what Google has indexed and remove anything you don't want. It will also let you make sure your sitemap and robots.txt are in order and that Google sees them properly.
I was just trying to figure this out. Thanks so much for the suggestion, chaoskaizer… that rocks. Requesting re-inclusion now…
Hi,
This was extremely helpful. I even added a new rule:
Disallow: /page
to exclude the previous/next pages in the WordPress blog. This got rid of all the duplicate title tags for new posts.
However, for the old posts, Google's tools still say I've got duplicate title tags.
Does this disappear after a full re-crawl of my site, or do I have to remove the ?-links manually in Google Webmaster Tools?
Shouldn’t this
Disallow: /wp-contents/plugins/
Disallow: /wp-contents/themes/
be without the s at the end, and look like this:
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/