Restricted by robot

  • esmi (@esmi), Forum Moderator

    You don’t need to. If you’ve submitted a sitemap to Google, it will assume that the URLs used in that sitemap are your preferred URLs. It uses this preference when returning results for your site to ensure that it doesn’t list each page twice – once with www and once without. This doesn’t harm your site’s indexing or page rank in any way.

  • esmi (@esmi), Forum Moderator

    Another thought – do you have a robots.txt file in your root folder? Have you checked it to see whether it restricts the www domain?
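
    For example, a file along these lines (the rules here are only an illustration; your own file may contain different ones) would tell all crawlers to stay away from the entire site, which is the kind of restriction to look out for:

    User-agent: *
    Disallow: /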

    Hi there,

    Thanks for the explanation and the help. That makes sense…

    Yes, as of last night I have a robots.txt file in the root. I uploaded this file:

    User-agent: *
    Allow: /

    This should work, correct?

    Your robots.txt is not blocking anything, so it’s not a problem.
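
    If you want, you can also point crawlers at your sitemap from robots.txt. Assuming your sitemap sits at the root of your domain (the URL below is just a placeholder for your own), a file like this allows everything and advertises the sitemap at the same time:

    User-agent: *
    Allow: /

    Sitemap: http://example.com/sitemap.xml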

    Great! Thanks…

  • The topic ‘Restricted by robot’ is closed to new replies.