
[Resolved] Is robots.txt rule working?

Viewing 5 replies - 1 through 5 (of 5 total)
  • I am not using a static robots.txt – at least not that I know of. How can I find out?

    Plugin Author RavanH


    Hi eeegamer,

    The field “Additional robots.txt rules” only gives you some extra (manual) control over the dynamic robots.txt output. It does not control the output of the dynamic sitemaps.
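    As an illustration only (the exact rules depend on your permalink structure), a typical entry in that field to keep crawlers away from tag archives might look like this — note that it affects crawling, not the sitemap contents:

    ```
    # Hypothetical example: ask crawlers to skip tag archive URLs
    # (assumes the default /tag/ permalink base)
    User-agent: *
    Disallow: /tag/
    ```

    Rules like this are appended to the dynamically generated robots.txt; they do not remove the tag URLs from the sitemap itself.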

    To exclude all tag URLs from the sitemap, you’ll need to go back to Settings > Reading and uncheck the box “Tags” under “Include taxonomies” in the XML Sitemap section.

    Hi RavanH,

    Thanks for your quick response. The mentioned box was unchecked the whole time, so I don’t know what the problem could be. I checked and unchecked it again now to see if that helps.

    Category is unchecked, too, but all my categories still show up in my sitemap.

    Plugin Author RavanH


    Did you remove any static sitemap files from your site root? And did you make sure no other sitemap-generating plugins are active? It does not sound like that sitemap is coming from my plugin…

    Can you share a link?

  • The topic ‘[Resolved] Is robots.txt rule working?’ is closed to new replies.