
XML Sitemap & Google News feeds
[resolved] Is robots.txt rule working? (6 posts)

  1. eeegamer
    Member
    Posted 3 months ago #

    Hi,

    I added a rule to the robots.txt, but when I access sitemap.xml the rule does not seem to have any effect yet. Does it take some time for the exclusion to apply?

    Here is the rule I added:

    Disallow: */tag/

    But sitemap.xml still shows all URLs matching */tag/*.

    Any idea?

    Thank you very much!

    https://wordpress.org/plugins/xml-sitemap-feed/

  2. eeegamer
    Member
    Posted 3 months ago #

    I am not using a static robots.txt - at least not that I know of. How can I find out?

  3. RavanH
    Member
    Plugin Author

    Posted 3 months ago #

    Hi eeegamer,

    The field "Additional robots.txt rules" is only to give some extra (manual) control over the dynamic robots.txt output. It does not control the output if the dynamic sitemaps.

    To exclude all tag URLs from the sitemap, you'll need to go back to Settings > Reading and uncheck the "Tags" box under "Include taxonomies" in the XML Sitemap section.

  4. eeegamer
    Member
    Posted 3 months ago #

    Hi RavanH,

    Thanks for your quick response. That box was already unchecked the whole time, so I do not know what the problem could be. I have now checked and unchecked it again to see if that helps.

  5. eeegamer
    Member
    Posted 3 months ago #

    "Categories" is unchecked too, but the sitemap still shows all my categories.

  6. RavanH
    Member
    Plugin Author

    Posted 3 months ago #

    Did you remove any static sitemap files from your site root? Did you make sure you have no other sitemap generating plugins active? It does not sound like the sitemap is from my plugin...
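
    If you want to see what is actually serving that sitemap, a quick check like this can help (just a rough sketch, not specific to my plugin; replace the URL with your own site):

    # Fetch the sitemap and print its opening lines; many sitemap plugins
    # add a stylesheet reference or generator comment near the top.
    from urllib.request import urlopen

    url = "https://example.com/sitemap.xml"
    with urlopen(url) as resp:
        print(resp.status, resp.headers.get("Content-Type"))
        body = resp.read().decode("utf-8", errors="replace")
        print("\n".join(body.splitlines()[:6]))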

    Can you share a link?
