First of all, great plugin. All the SEO features in one package, it’s great.
I installed it two days ago on http://shop.pieterzandvliet.com. I didn’t change much in the default settings, and nothing at all in the “Indexation” settings. But now, when I log in to Google Webmaster Tools, I see that a lot of pages, especially the tag pages, are blocked by robots.txt. Of course I don’t want this (I think :S )
The strange thing is that Google Webmaster Tools says that my http://shop.pieterzandvliet.com/robots.txt file has the following settings:
Meaning that no page should be blocked. I also checked some of the “blocked” pages, such as:
But I can’t find any individual setting in the Yoast plugin that could block these pages. (These are just a few of the 178 currently blocked pages.)
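A quick way to double-check what a robots.txt actually blocks is Python’s standard `urllib.robotparser`. The rules and the tag URL below are illustrative assumptions, not the actual contents of the file reported above:

```python
from urllib import robotparser

# Parse a typical "allow everything" robots.txt.
# These lines are an assumed example, since the real file
# contents aren't shown in this post.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
])

# With an empty Disallow directive, every crawler may fetch
# every page, including tag pages. The URL is hypothetical.
print(rp.can_fetch("Googlebot", "http://shop.pieterzandvliet.com/tag/example/"))
# → True
```

If this prints True for the pages Webmaster Tools lists as blocked, the live robots.txt is not the source of the blocking, and something else (e.g. a meta robots tag or an X-Robots-Tag header) may be responsible.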
Any solutions for this?
P.S. I do use the sitemap function of the Yoast plugin, and the blocked pages ARE listed in the sitemaps.
- The topic ‘Yoast WordPress SEO plugin blocks pages with robots.txt’ is closed to new replies.