Okay, I’m having a bit of trouble launching a site of mine. Last Friday, I turned off the setting that discourages search engines from indexing my site. I cleared my cache, reinstalled SEO by Yoast, and submitted my sitemap to Google Webmaster Tools. I know these things don’t happen overnight, but it’s been nearly a week and I still get this error in Google Webmaster Tools: “URL restricted by robots.txt”
I’ve looked everywhere on the site and in the source code and cannot figure out why robots.txt is still blocking my site.
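One thing worth checking: WordPress serves a *virtual* robots.txt when no physical file exists, so looking through the source code won’t necessarily show it. You can see what crawlers actually see by visiting yoursite.com/robots.txt directly. As a rough sketch (the rules below are hypothetical, mimicking what the “discourage search engines” setting outputs), Python’s standard library can show how a `Disallow: /` rule blocks everything:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules resembling what WordPress emits while
# "Discourage search engines from indexing this site" is enabled.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# With "Disallow: /" in effect, Googlebot is blocked from every URL.
print(parser.can_fetch("Googlebot", "https://example.com/"))       # → False
print(parser.can_fetch("Googlebot", "https://example.com/page/"))  # → False
```

If the live robots.txt still shows `Disallow: /`, the setting change hasn’t taken effect (or a plugin or a leftover physical robots.txt file is overriding it).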
Does anyone have any suggestions on what I might be missing?