Viewing 2 replies - 1 through 2 (of 2 total)
  • OK, you definitely need to redirect one version to the other, depending on which you prefer for your site (with or without the www.):

    So if you like http://vegasguyvip.com/, then redirect the www version to the non-www version.

    If you prefer http://www.vegasguyvip.com/, then redirect the non-www version to the www version.

    Once that redirect is in place and your site gets scanned again, the ghost ‘duplicate’ robots.txt will go away.

    If you don’t know how to do this, here’s a good tutorial that shows both:
    http://dense13.com/blog/2008/02/27/redirecting-non-www-to-www-with-htaccess/
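
    The tutorial's approach boils down to a couple of rewrite rules. A minimal sketch for the www-preferred case, assuming Apache with mod_rewrite enabled and using the domain from this thread, placed in the site root's .htaccess above the WordPress block:

    ```apache
    <IfModule mod_rewrite.c>
    RewriteEngine On
    # If the request came in on the bare (non-www) host...
    RewriteCond %{HTTP_HOST} ^vegasguyvip\.com$ [NC]
    # ...send a permanent (301) redirect to the www host, keeping the path
    RewriteRule ^(.*)$ http://www.vegasguyvip.com/$1 [R=301,L]
    </IfModule>
    ```

    To redirect the other way (www to non-www), swap the two hostnames. The 301 matters: it tells search engines the two hosts are one site, which is what makes the duplicate go away.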

    Thread Starter chris99460 (@chris99460)

    TrishaM, thank you so much for your reply. I now have the root (non-www) resolving to www. However, for the life of me, I cannot get the robots.txt file to change. I have installed four different plugins, including Yoast, to edit the robots.txt file, and they all show the code I want. But when I go to http://www.vegasguyvip.com/robots.txt, the robots.txt shows something completely different:
    User-agent: Yandex
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-login.php
    Disallow: /wp-register.php
    Disallow: /wp-content/themes
    Disallow: /wp-content/plugins
    Disallow: /wp-content/upgrade
    Disallow: /wp-content/themes_backup
    Disallow: /wp-comments
    Disallow: /cgi-bin
    Disallow: *?s=
    Host: http://www.vegasguyvip.com

    User-agent: *
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-login.php
    Disallow: /wp-register.php
    Disallow: /wp-content/themes
    Disallow: /wp-content/plugins
    Disallow: /wp-content/upgrade
    Disallow: /wp-content/themes_backup
    Disallow: /wp-comments
    Disallow: /cgi-bin
    Disallow: *?s=

    I have even uploaded a physical file called robots.txt to the root via FTP, trying to override this default robots.txt, but it still shows the version above.
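
    From what I understand, a physical robots.txt should take precedence anyway, because the stock WordPress rewrite rules skip any request that matches a real file on disk. This is the standard block WordPress writes to .htaccess (not necessarily what my host has):

    ```apache
    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    # These two conditions skip the rewrite for real files and directories,
    # so a physical /robots.txt would be served directly by Apache
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    # END WordPress
    ```

    If that block is intact and the real file is still being ignored, I'm guessing something upstream of WordPress, like a host-level config, proxy, or cache, is serving its own copy (the Yandex-specific ‘Host:’ line above looks like something a host would inject, not a plugin).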

    Any ideas?

    Thanks so much! Been stuck on this for 9 days…


The topic ‘Help Getting Two Robots.txt files!?!’ is closed to new replies.