• Resolved iinet


    Is it possible to warm the cache by using something like:

    wget --spider --recursive http://mywordpress.com

    I use this method on other systems to generate cache files for my sites every night, via a shell script that runs during backups. But when I tried it with my WordPress site running Quick Cache, no cache files were created.

    I’m guessing that Quick Cache prevents spiders from generating cache files somehow, perhaps by checking request headers. I’m stuck.

    Any advice on how to add a warm cache feature to my bash shell script would be wonderful!
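A warm-cache step along those lines might be wrapped up as a bash function like the sketch below. The function name and the flag choices beyond `--spider --recursive` are illustrative, not anything specific to Quick Cache:

```shell
#!/usr/bin/env bash
# warm_cache: spider a site so every internal page is requested once,
# giving a page-caching plugin the chance to write its cache files.
warm_cache() {
    local site="${1:?usage: warm_cache URL}"

    # --spider      request pages without saving them locally
    # --recursive   follow internal links
    # --level=inf   no depth limit on the crawl
    # --no-verbose  keep the nightly log short
    wget --spider --recursive --level=inf --no-verbose "$site"
}
```

One thing worth checking: if the original command was copied from a web page, the double hyphens in `--spider --recursive` are often silently converted to en dashes, which wget rejects as an unknown option.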



Viewing 4 replies - 1 through 4 (of 4 total)
  • Plugin Author Raam Dev


    I just tested this on my local site and it worked as expected. wget spidered the site, which resulted in Quick Cache generating cache files.

    Quick Cache does not exclude any User-Agents by default. (If you’re using Quick Cache Pro, you have access to the User-Agent Exclusion Patterns feature, which includes w3c_validator as a default exclusion, but even that shouldn’t affect a wget spider.)

    If you have the “Discourage search engines from indexing this site” option enabled in WordPress (Settings -> Reading), then WordPress will automatically serve a virtual robots.txt file that disallows all User-Agents. (However, if you manually add a robots.txt file, that will override the one generated by WordPress.)
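This matters for wget in particular, because its recursive mode honors robots.txt by default, so a disallow-everything file stops the crawl before any cache files are generated. A quick way to check, and to crawl anyway (`-e robots=off` is a standard wget option; the robots.txt content shown is roughly what WordPress serves with that setting enabled):

```shell
#!/usr/bin/env bash
# With "Discourage search engines" enabled, WordPress serves a virtual
# robots.txt along these lines, which blocks all robots-respecting
# spiders, including wget's recursive mode:
robots=$(cat <<'EOF'
User-agent: *
Disallow: /
EOF
)
echo "$robots" | grep -q '^Disallow: /' && echo "spiders are blocked"

# To inspect what the site actually serves:
#   wget -qO- http://mywordpress.com/robots.txt
#
# To warm the cache regardless of robots.txt rules:
#   wget --spider --recursive -e robots=off http://mywordpress.com
```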

    Plugin Author Raam Dev


    Also, since it’s related to your question, I thought I should mention that the next version of Quick Cache Pro includes a new Auto-Cache Engine feature, which basically accomplishes the same thing as “warming the cache”.

    You can see a screenshot of the new Auto-Cache Engine options panel here.

    Do you have an ETA for this version of Quick Cache Pro? This is one feature I’m eagerly waiting for…

    Plugin Author Raam Dev


    ETA is June 6th, 2014.

  • The topic ‘Warm Cache’ is closed to new replies.