
[Plugin: Yoast WordPress SEO] blocks pages with robots.txt (1 post)

  1. Itam
    Member
    Posted 3 years ago #

    Hi,

    First of all, great plugin. All the SEO stuff in one package — it's great.
    I installed it two days ago on http://shop.pieterzandvliet.com. I didn't change much in the default settings, and I didn't change anything at all in the "Indexation" settings. But now, when I log in to Google Webmaster Tools, I see that a lot of pages, especially the tag pages, are reported as blocked by robots.txt. Of course I don't want this (I think :S )
    The strange thing is that Google Webmaster Tools says my http://shop.pieterzandvliet.com/robots.txt file contains the following:

    User-agent: *
    Disallow:

    Meaning that no page should be blocked. I also checked some of the "blocked" pages, such as:

    http://shop.pieterzandvliet.com/tag/tekening/
    http://shop.pieterzandvliet.com/tag/kunst/
    http://shop.pieterzandvliet.com/tag/creatures/page/2/
    http://shop.pieterzandvliet.com/drawings/its-mine/

    But I can't find any individual setting in the Yoast plugin that would block these pages. (These are just a few of the 178 currently blocked pages.)
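    To double-check my reading of the rules: as far as I understand the robots.txt protocol, a `Disallow:` line with an empty value blocks nothing, so every URL above should be crawlable. A quick sketch with Python's standard `urllib.robotparser` (the URLs are the ones listed above; this only tests the two-line rules Google reports, not whatever my server actually serves):

    ```python
    from urllib.robotparser import RobotFileParser

    # The rules exactly as Google Webmaster Tools reports them.
    # An empty "Disallow:" value means "disallow nothing".
    rules = """User-agent: *
    Disallow:
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Each of the "blocked" pages should be allowed under these rules.
    for url in [
        "http://shop.pieterzandvliet.com/tag/tekening/",
        "http://shop.pieterzandvliet.com/tag/kunst/",
        "http://shop.pieterzandvliet.com/tag/creatures/page/2/",
        "http://shop.pieterzandvliet.com/drawings/its-mine/",
    ]:
        print(url, rp.can_fetch("*", url))  # → True for every URL
    ```

    So if the parser agrees that everything is allowed, the blocking must be coming from somewhere other than these two lines.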

    Any solutions for this?

    P.S. I do use the sitemap function of the Yoast plugin, and the blocked pages ARE included in the sitemaps.

Topic Closed

This topic has been closed to new replies.