WordPress.org Forums

Prudent Robots.txt Strategy After Permanent 301 Redirect From Old to New Site (2 posts)

  1. lennoxtutoring
    Member
    Posted 1 month ago #

I've been using WordPress for two years. I built a site in 2012, realized I'd made a lot of SEO mistakes, including the choice of domain name, and have recently created a new site with better SEO. The new site has many of the components of the OLD site:

    http://lennoxtutoring.com

    (i.e. its content), but with a different, more user-friendly permalink structure. A permanent 301 redirect is in place, pointing the old site to the NEW site:

    http://phd-organic-chemistry-tutor.com
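    For reference, a site-wide permanent redirect like this is typically done in the old site's .htaccess (a sketch assuming Apache with mod_rewrite enabled; these rules are illustrative, not necessarily the exact file in use):

    ```apache
    # Send every request on the old domain to the same path on the new domain.
    # R=301 marks the redirect as permanent, L stops further rule processing.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?lennoxtutoring\.com$ [NC]
    RewriteRule ^(.*)$ http://phd-organic-chemistry-tutor.com/$1 [R=301,L]
    ```

    The important part for search engines is the 301 status code itself: it tells crawlers the move is permanent, regardless of how the redirect is implemented.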

    When I publish articles, I use both tags and categories.

    QUESTION(s):
    1. What are the best steps to take to keep Googlebot from "thinking" I might be duplicating content?
    2. What is a good robots.txt configuration that supports SEO while avoiding duplicate content?

    This is what I currently have in my robots.txt file:

    User-agent: *
    # Block archive and category listings, which repeat post content
    Disallow: /archives/
    Disallow: /category/
    # Block non-content WordPress internals
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-content/cache
    Disallow: /wp-content/plugins
    Disallow: /wp-content/reply
    Disallow: /wp-includes/
    Disallow: /trackback/
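    As a quick local sanity check, Python's standard library can parse these rules and report what a compliant crawler would be allowed to fetch (the URLs below are made-up examples):

    ```python
    from urllib.robotparser import RobotFileParser

    # A trimmed-down version of the rules above
    rules = """\
    User-agent: *
    Disallow: /category/
    Disallow: /wp-admin/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # Category archives are blocked for all user agents
    print(parser.can_fetch("*", "http://phd-organic-chemistry-tutor.com/category/chemistry/"))
    # Ordinary post URLs remain crawlable
    print(parser.can_fetch("*", "http://phd-organic-chemistry-tutor.com/some-post/"))
    ```

    Note that robots.txt rules are prefix matches, so `Disallow: /category/` already covers everything under that path; a trailing `*` is unnecessary.
    
    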

    In a nutshell: what's best to put in the robots.txt file so that the new site's content gets indexed without appearing to be duplicate content?

    Thank you!
    Joseph

  2. 1. Nothing. :) Googlebot knows that a 301 redirect is permanent, and will even update the existing listings in Google's index to point to the new site.

    2. Likewise: if you are indeed issuing a permanent 301 redirect, you don't need to change anything in robots.txt either.
