  • You don’t need to. If you’ve submitted a sitemap to Google, it will assume that the URLs used in that sitemap are your preferred URLs (a minimal sitemap example is shown below). It uses this preference when returning results for your site to ensure that it doesn’t list each page twice – once with www and once without. This doesn’t harm your site’s indexing or page rank in any way.

    Another thought – do you have a robots.txt file in your root folder? Have you checked that to see if it restricts the www domain?
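
    If it contains something like the following, crawlers are being blocked:

    User-agent: *
    Disallow: /

    Disallow: / blocks the whole site; a narrower rule such as Disallow: /private/ (a made-up path here) blocks only that folder. And for reference, a minimal sitemap that lists the www form of each page looks something like this (example.com is just a placeholder); those entries are what tell Google which form you prefer:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/</loc></url>
    </urlset>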

    Thread Starter jmor65 (@jmor65)

    Hi there,

    Thanks for the explanation and the help. That makes sense…

    Yes, as of last night I have a robots.txt file in the root. I uploaded this file:

    User-agent: *
    Allow: /

    This should work, correct?

    Your robots.txt is not blocking anything, so it’s not a problem.
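
    For what it’s worth, the traditional way to allow everything is an empty Disallow rule; both forms are understood by the major crawlers:

    User-agent: *
    Disallow: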

    Thread Starter jmor65 (@jmor65)

    Great! Thanks…

  • The topic ‘Restricted by robot’ is closed to new replies.