Google webmaster robots.txt problem (5 posts)

  1. tudormink
    Member
    Posted 3 years ago #

    Hi,

    I've installed WordPress onto a domain here - http://shedalarms.com

    After verifying the site with Google Webmaster Tools, I was told the site is being blocked from Google by the robots.txt file. I checked the privacy settings and it was indeed blocked, so I changed the setting to make the site crawlable.

    My robots.txt file on the domain currently has an empty Disallow line (no forward slash, so I assume nothing is blocked!) - http://shedalarms.com/robots.txt
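    (For reference, the allow-all version is just an empty Disallow line, something like this:

    User-agent: *
    Disallow:

    The only character that differs from the blocking rules below is the trailing slash - "Disallow: /" blocks every path, while "Disallow:" with nothing after it blocks nothing.)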

    However, in Google Webmaster Tools it still reads as:

    User-agent: *
    Disallow: /

    Sitemap: http://shedalarms.com/sitemap.xml.gz

    I've also tried adding my own robots.txt file that allows everything, but Google is still reading it as above, despite my telling it to check my site again.

    Can anyone offer up any suggestions as to why this is happening? Do I just need to wait for it to refresh, or is something wrong?

    Thanks

  2. Tim S
    Member
    Posted 3 years ago #

    Sometimes in Google Webmaster Tools I've noticed it takes a bit to update and show the correct settings after I make a change. I'd just let it be for a while and see if it corrects itself.

  3. If you have any caching, remember to flush it.
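    A quick way to confirm what the server is actually sending (rather than what is in the file you edited) is to request it directly, for example:

    curl http://shedalarms.com/robots.txt

    If that still shows "Disallow: /", something on the server (a caching plugin or the old privacy setting) is still serving the blocking rules; if it shows the allow-all version, it is just Webmaster Tools lagging behind.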

  4. Thea
    Member
    Posted 3 years ago #

    I'm having the exact same problem.

  5. revium
    Member
    Posted 3 years ago #

    I agree with Tim; I have sometimes seen it take a few days to update in Google Webmaster Tools.

    In my experience, even though Webmaster Tools says there is still a conflict, Google was crawling the site once I changed the robots.txt; it just hadn't updated the error message yet.

Topic Closed

This topic has been closed to new replies.
