WP Super Cache
Why would you not want BOTS to cause caching (3 posts)

  1. Strictly Software
    Posted 2 years ago #

    Just a question about the option in the Admin options under the title "Rejected User Agents" which says

    Strings in the HTTP 'User Agent' header that prevent WP-Cache from caching requests by bots, spiders, and crawlers. Note that super cached files are still sent to these agents if they already exist.

    It contains these default strings:


    Can I ask why you wouldn't want these BOTS to cause cached files to be created? Surely if a new post is published and a BOT is the first "user" to visit the page, you would want the cached file generated then, especially if you post to Twitter, where you get a Twitter Rush of 50+ BOTS hitting your site as soon as the Tweet appears (see http://blog.strictly-software.com/2011/11/twitter-rush-caused-by-tweet-bots.html).

    I was going to add a feature to my own Twitter plugin, Strictly TweetBot, that would request any new post/page before posting to Twitter, so that when the Rush comes the bots are served cached pages and don't all hit the server at the same time, driving up CPU usage etc.
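    That warming idea can be sketched in a few lines of Python (a sketch only, not the plugin's code; the URL is hypothetical, and the HTTP fetcher is injected so the logic can be exercised without a live site):

    ```python
    from typing import Callable, Dict, Iterable, Optional

    def warm_cache(urls: Iterable[str],
                   fetch: Callable[[str], int]) -> Dict[str, Optional[int]]:
        """Request each URL once so the caching plugin generates a static
        copy before the Twitter Rush arrives.

        `fetch` performs the HTTP GET and returns the status code; in a
        real deployment it could wrap urllib.request.urlopen.
        """
        results: Dict[str, Optional[int]] = {}
        for url in urls:
            try:
                results[url] = fetch(url)
            except Exception:
                results[url] = None  # warming is best-effort; never block the tweet
        return results

    # Example with a stub fetcher standing in for real HTTP requests.
    hits = []
    status = warm_cache(
        ["http://example.com/?p=123"],  # hypothetical new-post URL
        fetch=lambda u: (hits.append(u), 200)[1],
    )
    ```

    In a real plugin this call would happen just before the tweet is sent, so the first wave of bots receives the already-cached page.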

    Therefore I am wondering why you wouldn't want BOTS to generate cached pages, since that would have them ready for real users?



  2. Meeker
    Posted 2 years ago #

    I am wondering this also.

  3. jondaley
    Posted 2 years ago #

    I've wondered that too. From comments elsewhere, e.g. "keeping cached files on disk is 'expensive'", my guess is that people have limited disk space and so don't want posts cached that only bots ever visit.

    I always erase everything out of the user-agent box, and preload all posts, so everything is cached for everyone.

Topic Closed

This topic has been closed to new replies.
