
[resolved] Site Indexing errors (3 posts)

  1. mobeustech
    Member
    Posted 1 year ago

    I am getting the following error when I submit a sitemap to Google or Bing: "Sitemap contains urls which are blocked by robots.txt". It shows that I have submitted 145 items and that 44 of them are blocked by the robots.txt file.

    I have added the following to my robots.txt:

    User-agent: *
    Allow: /
    Disallow:
    Sitemap: http://www.mobeustech.com/sitemap_index.xml

    I have checked the Reading page in my admin panel and made sure that "Discourage search engines from indexing this site" was unchecked.

    My .htaccess file is the WordPress default for %postname% permalinks. I really have no idea what to look at.

    My server is CentOS 6.2, PHP 5.3, MySQL 5.2, Apache 2.2.
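
    One way to verify which submitted URLs robots.txt actually blocks is to test them the way a crawler would, e.g. with Python's standard-library urllib.robotparser. This is only a minimal sketch against the sitemap above; for a Yoast-style sitemap index the <loc> entries are child sitemaps, so a full check would recurse into each one:

    import urllib.request
    import urllib.robotparser
    import xml.etree.ElementTree as ET

    SITE = "http://www.mobeustech.com"
    LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

    # Parse the live robots.txt the same way a crawler would.
    rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()

    # Collect every <loc> entry from the sitemap (index).
    with urllib.request.urlopen(SITE + "/sitemap_index.xml") as resp:
        locs = [el.text.strip() for el in ET.parse(resp).iter(LOC)]

    blocked = [u for u in locs if not rp.can_fetch("*", u)]
    print("checked", len(locs), "URLs - blocked", len(blocked))
    for u in blocked:
        print(u)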

  2. Krishna
    Volunteer Moderator
    Posted 1 year ago

    The purpose of a robots.txt file is basically to block bots from parts of a site. Your file as posted allows all bots everywhere, so just keep it that way unless you have some files to block.
    Review: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449

    Also, see if you have noindex,nofollow settings in your SEO plugin, theme settings, or elsewhere.
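
    A quick way to check is to fetch a page and look for both signals: a robots meta tag in the HTML and an X-Robots-Tag HTTP header. A minimal standard-library sketch (the URL is the site from the post above):

    import re
    import urllib.request

    url = "http://www.mobeustech.com/"
    with urllib.request.urlopen(url) as resp:
        # Servers can send noindex via this header, not just in the HTML.
        header = resp.headers.get("X-Robots-Tag")
        html = resp.read().decode("utf-8", errors="replace")

    # Find any <meta name="robots" ...> tags in the page source.
    tags = re.findall(r"<meta[^>]*name=['\"]robots['\"][^>]*>", html,
                      re.IGNORECASE)

    print("X-Robots-Tag header:", header or "none")
    print("robots meta tags:", tags or "none")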

  3. mobeustech
    Member
    Posted 1 year ago

    I can't find any robots meta tags that have noindex,nofollow. The only plugins that I am currently using are Yoast SEO and Google Analytics.

    Looks like Google refreshes/updates its cached copy of a robots.txt file on roughly a 24-hour cycle. It is working now.

This topic has been closed to new replies.