

Prudent Robots.txt Strategy After Permanent 301 Redirect From Old to New Site (3 posts)

  1. John Smith
    Posted 8 months ago #

    I've been using WordPress for two years. I built a site in 2012, realized I had made a lot of SEO mistakes, including with the domain name, and have recently created a new site with better SEO. The new site has many of the components of the OLD site:


    (i.e. its content), but with a different, more user-friendly permalink structure. A permanent 301 redirect from the old site has been put in place, pointing it to the NEW site:


    When I publish articles, I use both tags and categories.

    1. What are the best steps to take to avoid any possibility of Googlebot "thinking" I might be duplicating content?
    2. What is a good robots.txt configuration to support SEO while avoiding duplicate content?

    This is what I currently have in my robots.txt file:

    User-agent: *
    Disallow: /archives/
    Disallow: /category/*
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-content/cache
    Disallow: /wp-content/plugins
    Disallow: /wp-content/reply
    Disallow: /wp-includes/
    Disallow: /trackback/

    In a nutshell: what is best to place in the robots.txt file so that the new site's content is indexed without appearing duplicated?
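    As a quick offline sanity check, rules like the ones above can be tested with Python's standard urllib.robotparser. A minimal sketch (the domain is a placeholder; note that the trailing * in "Disallow: /category/*" is a wildcard extension that the original robots.txt standard, and Python's parser, treat literally, so the plain prefix "Disallow: /category/" used below is safer):

    ```python
    from urllib.robotparser import RobotFileParser

    # A subset of the rules from the post; example.com stands in for the real domain.
    rules = """\
    User-agent: *
    Disallow: /archives/
    Disallow: /category/
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /trackback/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # Category archives are blocked; ordinary post URLs are not.
    blocked = parser.can_fetch("*", "https://example.com/category/news/")
    allowed = parser.can_fetch("*", "https://example.com/2013/05/my-post/")
    print(blocked, allowed)  # False True
    ```

    This only checks what the file says crawlers may fetch; it says nothing about how Google handles duplicates across the two domains.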

    Thank you!

  2. 1. Nothing. :) The Googlebot knows that a 301 redirect is a permanent redirect, and will even adjust the existing listings in Google's index to point to the new site.

    2. Likewise: if you are indeed issuing a permanent 301 redirect, you don't need to change anything there either.
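    For reference, a domain-wide permanent redirect of this sort is commonly done in the old site's .htaccess with Apache's mod_rewrite. A minimal sketch, with placeholder domains (since the permalink structure changed between the sites, per-URL rules or a redirect plugin may be needed to map old post URLs to their exact new locations):

    ```apacheconf
    # Redirect every request on the old domain to the same path on the new one.
    # old-site.example and new-site.example are placeholders.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
    RewriteRule ^(.*)$ https://new-site.example/$1 [R=301,L]
    ```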

  3. John Smith
    Posted 2 months ago #

    James, is there any way to delete this topic from the forum? If you email me, I'll explain exactly why.

    Thank you!

