• Resolved eeegamer

    (@eeegamer)


    Hi,

    I added a rule to the robots.txt, but when I access the sitemap.xml the rule is not working yet. Does it take some time to exclude it?

    Here is the rule I added:

    Disallow: */tag/

    But sitemap.xml still shows all URLs matching */tag/*.

    Any idea?

    Thank you very much!

    https://wordpress.org/plugins/xml-sitemap-feed/

Viewing 5 replies - 1 through 5 (of 5 total)
  • Thread Starter eeegamer

    (@eeegamer)

    I am not using a static robots.txt – at least not that I know of. How can I find out?

    Plugin Author Rolf Allard van Hagen

    (@ravanh)

    Hi eeegamer,

    The field “Additional robots.txt rules” only gives some extra (manual) control over the dynamic robots.txt output. It does not control the output of the dynamic sitemaps.

    To exclude all tag URLs from the sitemap, you’ll need to go back to Settings > Reading and uncheck the “Tags” box under “Include taxonomies” in the XML Sitemap section.
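    One way to verify the setting took effect (a rough sketch, not part of the plugin): fetch the sitemap and list any URLs that still contain /tag/. The helper below parses sitemap XML with Python’s standard library; the sample XML and URLs are made up for illustration.

    ```python
    import xml.etree.ElementTree as ET

    # Standard sitemap protocol namespace (sitemaps.org)
    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    def tag_urls(sitemap_xml: str) -> list[str]:
        """Return every <loc> URL in the sitemap that contains '/tag/'."""
        root = ET.fromstring(sitemap_xml)
        locs = [el.text.strip() for el in root.iter(SITEMAP_NS + "loc") if el.text]
        return [url for url in locs if "/tag/" in url]

    # Made-up sample data, just to demonstrate the check:
    sample = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://example.com/hello-world/</loc></url>
      <url><loc>https://example.com/tag/news/</loc></url>
    </urlset>"""

    print(tag_urls(sample))  # → ['https://example.com/tag/news/']
    ```

    If the list comes back empty after unchecking “Tags”, the sitemap is clean.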

    Thread Starter eeegamer

    (@eeegamer)

    Hi RavanH,

    Thanks for your quick response. The box you mention was unchecked the whole time, so I do not know what the problem could be. I toggled it now to see if that helps.

    Thread Starter eeegamer

    (@eeegamer)

    “Categories” is unchecked too, but the sitemap still shows all my categories.

    Plugin Author Rolf Allard van Hagen

    (@ravanh)

    Did you remove any static sitemap files from your site root? Did you make sure no other sitemap-generating plugins are active? It does not sound like the sitemap is coming from my plugin…

    Can you share a link?
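    To help figure out which plugin is actually producing the sitemap, it can be useful to look at the XML comments and the xml-stylesheet reference near the top of the file, since many sitemap plugins identify themselves there. A rough sketch (what each plugin actually emits varies, and the sample below is invented):

    ```python
    import re

    def generator_hints(sitemap_xml: str) -> list[str]:
        """Pull XML comments and xml-stylesheet references, which often name the generator."""
        comments = re.findall(r"<!--(.*?)-->", sitemap_xml, flags=re.S)
        stylesheets = re.findall(r"<\?xml-stylesheet(.*?)\?>", sitemap_xml, flags=re.S)
        return [hint.strip() for hint in comments + stylesheets]

    # Made-up sample; real output differs per plugin:
    sample = """<?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="/sitemap.xsl"?>
    <!-- generated by Some Sitemap Plugin -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></urlset>"""

    for hint in generator_hints(sample):
        print(hint)
    ```

    Running this against the live sitemap’s source (view-source in the browser works too) usually makes the generator obvious.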


The topic ‘Is robots.txt rule working?’ is closed to new replies.