Support » Fixing WordPress » Googlebot is blocked by robots.txt

  • Resolved tashb


    I have a problem with getting Google to index my WordPress site.

    I am getting errors and warnings saying “Googlebot is blocked” from my site and “URL restricted by robots.txt” in my sitemap uploads.

    Initially, while I was creating my website, I ticked the “Ask search engines not to index this site” option in the Privacy settings. I have now ticked the “Allow search engines to index this site” option instead. I have checked the robots.txt file and it looks fine, and when I test it, the tool confirms that my robots.txt exists and allows search engines to index the site.
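    For context, WordPress serves a virtual robots.txt whose contents depend on that Privacy setting. The two states look roughly like this (the exact output varies by WordPress version, so treat this as an illustrative sketch):

    ```
    # "Ask search engines not to index this site" — blocks everything:
    User-agent: *
    Disallow: /

    # "Allow search engines to index this site" — blocks nothing:
    User-agent: *
    Disallow:
    ```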

    Has anyone experienced this problem? Is there a solution I can try to overcome the Googlebot problem? Will the problem go away the next time Google tries to index my site, or will it remain blocked?

    I am very concerned that I will not be able to get my site listed on Google searches because of this.

    Thank you,

Viewing 11 replies - 1 through 11 (of 11 total)
  • Check:

    Admin Dashboard>Settings>General>Privacy

    What is the setting?

    The Privacy setting is currently “Allow search engines to index this site”. I changed this yesterday.


    Use Google Webmaster Tools to re-index and re-submit your robots.txt file. If using a caching plugin make sure to flush it.
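    Before resubmitting, you can also verify locally what a given robots.txt actually permits. Here is a small sketch using Python's standard-library parser (the rules string below is the blocking form WordPress produces, shown for illustration):

    ```python
    # Check whether a robots.txt would let Googlebot crawl the site root.
    from urllib.robotparser import RobotFileParser

    # The rules WordPress serves when "Ask search engines not to
    # index this site" is ticked: every path is disallowed.
    rules = """User-agent: *
    Disallow: /
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # With "Disallow: /" every URL is blocked for every crawler.
    print(parser.can_fetch("Googlebot", "/"))  # → False
    ```

    Running the same check against your live file (point `RobotFileParser` at your site's robots.txt URL with `set_url()` and `read()`) tells you whether the blocking rule is still being served.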

    Hi Seacoast,

    Thanks for your help. Can you please explain to me what a caching plugin is and also, how would I go about flushing this if I am using one?

    Thank you,

    Thanks, Seacoast.

    I googled ‘caching plugin’ and found out you can add a caching plugin to your website that generates static HTML files from the PHP code, improving browsing performance for people viewing the site.

    I’m not using a caching plugin, so I don’t have any HTML files to flush or regenerate.


    Has anyone else experienced the Googlebot robots.txt blocking problem when they’ve changed the site Privacy setting from “Ask search engines not to index this site” to “Allow search engines to index this site”?

    Does this problem go away by itself or will Google not be able to index my site? What can I do to resolve this?

    Thanks for your help,

    A) Please provide a site link for us to review all this. Without it, your queries are moot.

    B) Server caching, when done properly with a supported and well-maintained plugin such as WP Super Cache, ‘records’ the generated source code (HTML) in a simple manner and sends that to your site visitors. It reduces the page load time and, combined with proper local browser caching rules, allows the local browser to carry the weight on return visits.
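    The idea behind page caching can be sketched in a few lines: store the HTML the first time a page is generated, then serve the stored copy on later requests. This is only an illustrative model, not WP Super Cache’s actual implementation; all names here are made up.

    ```python
    # Minimal model of a page cache: reuse generated HTML
    # instead of re-running the expensive PHP/database work.
    page_cache = {}

    def render_page(path):
        # Stand-in for the slow work of building the page.
        return f"<html><body>Content for {path}</body></html>"

    def serve(path):
        if path not in page_cache:           # cache miss: generate and store
            page_cache[path] = render_page(path)
        return page_cache[path]              # cache hit: reuse stored HTML

    serve("/about")   # first request: page is generated
    serve("/about")   # return visit: served from the cache
    ```

    "Flushing" the cache simply means emptying that store so pages get regenerated, which is why you must do it after changing a setting like robots.txt behaviour.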

    As for Google – wait. Add good and unique content. It’s not overnight.

    Thanks, Seacoast. Site link is pretty much as above without the robots.txt –

    I’ll try WP Super Cache too. Our page load time is very slow so this is very useful to know.

    Will work on improving the content also, and wait 🙂




    I have a similar problem described here and despite it being set as ‘resolved’ it is still a problem and I don’t have a solution.

    @ruatuki, the file does not appear to be web readable – contact your webhost and seek their assistance in resolving this issue.

    I waited a couple of days, then went back into Google Webmaster Tools. When I went to Fetch as Google and hit the ‘Fetch’ button, the status was ‘succeeded’ and I was able to submit the index.

    I did nothing other than wait, although I did send a tweet with my website address referenced in it in the hope this would make Google crawl my site again sooner, so as to find the new, non-blocking, robots.txt file.

    Just wanted to provide an update that waiting works.

    Thanks to everyone for their help.

  • The topic ‘Googlebot is blocked by robots.txt’ is closed to new replies.