
WordPress SEO by Yoast
Incompatible with functions do_robotstxt & do_robots (8 posts)

  1. Umbrovskis.com
    Member
    Posted 3 years ago #

    Just wanted to say: since the plugin uses the do_robotstxt & do_robots functions, it's not possible to make a dynamic robots.txt file.
    While it's possible to produce a virtual robots.txt file, because of those same functions all the content from robots.txt also gets appended to sitemaps.xml :(

    Based on an old (2008) hack at http://wpmu.org/wpmu-robotstxt-globally/
    Example:

    ```
    (...)
    <url>
    <loc>http://example.com/category/uncategorized/</loc>
    <lastmod>2011-01-12T16:01:00+00:00</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.2</priority>
    </url>
    <!-- XML Sitemap Generated by Yoast WordPress SEO, containing 10 URLs -->
    </urlset>Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-login.php
    (...)
    ```
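
    The wpmu.org hack referenced above works roughly like this: a file dropped into mu-plugins hooks WordPress's do_robotstxt action (which core fires inside do_robots() when serving the virtual robots.txt) and echoes extra rules for every site in the network. A minimal sketch, assuming the callback name my_global_robots from later posts in this thread; the filename and the exact Disallow lines are illustrative:

    ```php
    <?php
    /*
     * mu-plugins/global-robots.php
     * Sketch of the 2008-era "global robots.txt" hack. The
     * do_robotstxt action is fired by WordPress core's do_robots()
     * while serving the virtual robots.txt, so anything echoed here
     * is appended to robots.txt output network-wide.
     */
    function my_global_robots() {
        echo "Disallow: /wp-admin\n";
        echo "Disallow: /wp-includes\n";
        echo "Disallow: /wp-login.php\n";
    }
    add_action( 'do_robotstxt', 'my_global_robots' );
    ```

    The bug described in this thread is that the plugin's sitemap output reuses the same robots hook, so the echoed rules end up appended after </urlset> as shown above.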

  2. Umbrovskis.com
    Member
    Posted 3 years ago #

    Forgot to say: I'm using WordPress MS 3.1 with subdomains AND domain mapping.

  3. KellyAWP
    Member
    Posted 3 years ago #

    I'm also having issues with this, using the my_global_robots function, on WP 3.1 MU with subdirectories and domain mapping.

    Sitemaps work fine until I add my global robots file to mu-plugins; after that,
    accessing sitemaps.xml or the .gz version displays the contents of the robots.txt file instead.

    I've had to disable my global robots for now.

  4. KellyAWP
    Member
    Posted 3 years ago #

    Update:

    Using a custom function, i.e. add_action('do_robots', 'my_global_robots'), appears to work for the sitemap.xml.gz file (Google Webmaster Tools still loads it) but breaks the .xml page as shown above. In fact, it even adds the W3TC cache timestamps at the footer as if it were a regular WP page. Not sure if the stray Disallow: before the rest will have any negative effect, but I'd think not.

    Using a custom function plus remove_action('do_robots', 'do_robots') in order to completely override the output breaks both.
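
    The two approaches described above can be sketched as follows. This assumes the poster's callback name my_global_robots; note that core registers its own do_robots() function on the do_robots action in default-filters.php, which is what the remove_action() call unhooks:

    ```php
    <?php
    // Attempt 1: append custom rules to core's robots.txt output
    // by adding a callback to the do_robots action.
    add_action( 'do_robots', 'my_global_robots' );

    // Attempt 2: additionally unhook core's own handler so the
    // custom callback completely replaces the default output.
    // Per this thread, this broke both robots.txt and the sitemaps.
    remove_action( 'do_robots', 'do_robots' );
    ```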

  5. KellyAWP
    Member
    Posted 3 years ago #

    Apologies, but disregard the above about xml.gz still working.

    I assumed that since the downloaded file looked the same it should be fine, but after checking Webmaster Tools today it isn't validating for any of my sites. I'm guessing the headers are wrong, then, just like they are wrong for the sitemap.xml, rendering the XML as plain text.

  6. Joost de Valk
    Member
    Plugin Author

    Posted 3 years ago #

    I might have found the issue here; an update is coming in the next release which I hope fixes this. In a future version I'll move away from using the robots hook for my sitemaps anyway, so it's short-term pain.

  7. Umbrovskis.com
    Member
    Posted 3 years ago #

    Looks like we are on the same path: the sitemap will be produced from feeds (according to your tweet).

  8. Joost de Valk
    Member
    Plugin Author

    Posted 3 years ago #

    Yeah, my Google News module, which I'm beta testing with a small group now, already does that and it works nicely.

Topic Closed

This topic has been closed to new replies.
