Support » Fixing WordPress » Site Indexing errors

  • Resolved mobeustech


    I am getting the following error when I submit a sitemap to Google or Bing: “Sitemap contains urls which are blocked by robots.txt”. It shows that I have submitted 145 items and that 44 items are blocked by the robots.txt file.

    I have added the following to my robots.txt:

    User-agent: *
    Allow: /

    I have checked the Reading page in my admin panel and made sure that “Discourage search engines from indexing this site” was unchecked.

    My .htaccess file is the WordPress default for %postname%. I really have no idea what else to look at.

    My server is CentOS 6.2, PHP 5.3, MySQL 5.2, Apache 2.2.
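
    If it helps to rule out the robots.txt rules themselves, here is a minimal sketch using Python's standard-library `urllib.robotparser` to test whether a given URL would be blocked under the rules you posted. The `example.com` URL is a placeholder; substitute a real URL from your sitemap.

    ```python
    from urllib.robotparser import RobotFileParser

    # The rules reported in the post. Any *other* lines in the live
    # robots.txt (or a stale cached copy at the crawler) could still
    # block URLs even if these two lines allow everything.
    rules = """\
    User-agent: *
    Allow: /
    """.splitlines()

    parser = RobotFileParser()
    parser.parse(rules)

    # Placeholder URL -- swap in one of the 44 blocked sitemap entries.
    allowed = parser.can_fetch("Googlebot", "https://example.com/sample-post/")
    print(allowed)  # True under these two rules
    ```

    If this prints True for a URL that Google still reports as blocked, the problem is likely a stale cached robots.txt on the search engine's side rather than your current rules.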

Viewing 2 replies - 1 through 2 (of 2 total)
  • The purpose of a robots.txt file is basically to block bots. The first line of your file allows all bots, so just keep that unless you have some files to block.

    Also, check whether you have noindex,nofollow settings in your SEO plugins, theme settings, or elsewhere.
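
    One way to check for that is to scan the page source for a robots meta tag. Here is a rough sketch with Python's standard-library `html.parser`; the sample HTML string is just an illustration, and you would feed in the real source of one of the blocked pages instead.

    ```python
    from html.parser import HTMLParser

    class RobotsMetaFinder(HTMLParser):
        """Collects the content attribute of any <meta name="robots"> tags."""
        def __init__(self):
            super().__init__()
            self.directives = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", ""))

    # Example head markup -- replace with the fetched source of a blocked page.
    html = '<head><meta name="robots" content="noindex,nofollow"></head>'

    finder = RobotsMetaFinder()
    finder.feed(html)
    print(finder.directives)  # ['noindex,nofollow']
    ```

    If the list comes back empty for your real pages, no plugin or theme is injecting a noindex directive, and the block is coming from robots.txt handling alone.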

    I can't find any robots meta tags that have noindex,nofollow. The only plugins that I am currently using are Yoast's SEO and Google Analytics.

    It looks like Google takes about 24 hours to refresh/update a cached robots.txt file. It is working now.

  • The topic ‘Site Indexing errors’ is closed to new replies.