Support » Fixing WordPress » Blocked by robots.txt issue

  • Hello everyone, I'm quite new to WP, and I see a lot of people having this issue when submitting a sitemap to Google, but I didn't really find a solution, so I'm writing here hoping someone will be able to help me. After uploading the sitemap to Google and testing it, I got the warning "Sitemap contains URLs which are blocked by robots.txt" for all pages.

    -I have checked my privacy settings, and the site is set to allow search engines
    -I am using an XML sitemap generator
    -Fetch as Google succeeds
    -I'm using the WP Robots Txt plugin to check the content of the virtual robots.txt file, and it contains only:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    -I also have a robots.txt file in the root directory on my server, and it only has one line: sitemap:

    Can anyone help me with this issue, please?
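    The rules quoted above only block the admin and includes directories, which is the stock WordPress behaviour. As a quick sanity check that they don't block ordinary pages, here is a sketch using Python's standard-library robots.txt parser (the domain and post URL are placeholders, not the poster's actual site):

    ```python
    # Sketch: check whether URLs are blocked by the robots.txt rules
    # quoted in this thread, using the standard-library parser.
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # An ordinary post URL is NOT blocked by these rules:
    print(rp.can_fetch("Googlebot", "https://example.com/sample-post/"))  # True

    # Only the admin and includes directories are disallowed:
    print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))     # False
    ```

    If the check passes for your real page URLs, the "blocked by robots.txt" message is more likely a stale cache on Google's side than a problem with these rules.
    
    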


Viewing 8 replies - 1 through 8 (of 8 total)
  • A site link will reveal your site's robots file; without it, we punt.

    I am sorry, do you mean you need my website's robots.txt URL? It's

    Yes, but with so many unknown scripts, I defer.

    When did you submit your sitemap to Google Webmaster Tools?
    If only done recently, you need to allow time for a Google bot to revisit your site and pick up the new settings. In my experience it can take anywhere from a few hours up to a day for a robots.txt change to register.

    Thanks for the reply. I submitted the sitemap 24 hours ago, and it still says pending. How do I know whether Google has visited my site?

    I use a plugin called NewStatPress, and it tells me who visits the site, including spiders, bots, etc.

    There are different Google bots that do different jobs, so don't assume that the visits you do get are from the one that checks your robots.txt.

    So far today I have had three Google bots, but this plugin doesn't tell me their IP addresses, so I don't know which ones they are.

    Do you think I should just wait and it will fix itself, or should I do something about it? If Fetch as Google succeeds, does that mean it's not actually blocked?

    Try this:

    User-agent: *
    Allow: /*?$

    Make a robots.txt file with only the above syntax in it and replace your current file with it. You have to place it in your root directory.

    You can see more about robots.txt HERE.

    After uploading the robots.txt file, resubmit your sitemap.
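    One thing worth checking in this situation: WordPress only serves its virtual robots.txt when no physical file exists in the root directory, so the one-line file on the server is what Google actually sees, not the plugin's version. A single physical file that keeps the stock rules and the sitemap reference together might look like this (the sitemap URL is a placeholder; use your own):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Sitemap: https://example.com/sitemap.xml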

  • The topic ‘Blocked by robots.txt issue’ is closed to new replies.