Autoptimize
[resolved] Cache size when optimising JS (5 posts)

  1. fotofashion
    Member
    Posted 1 year ago #

    Hi Frank!

    First of all... thanks a lot for the update! The JS exclusion is a great feature. I figured out what scripts to exclude and can now use the optimisation without breaking anything. Cool!

    But I noticed that the plugin creates one cached file per page, although the pages should be using the same scripts. I'm not sure whether this is how it's supposed to work. The same happens when "look for scripts only in <head>" is enabled.

    Example: I have a portfolio page with 48 images. Each image can be opened on its own page. The content is all identical except for the image, and every time I open a new page, a new cache file gets generated. Is that what you would expect?

    Regards,
    Andreas

    http://wordpress.org/extend/plugins/autoptimize/

  2. futtta
    Member
    Plugin Author

    Posted 1 year ago #

    Well, it's pretty simple actually: as soon as there is a difference of only one character, the hash of the aggregated JS no longer matches the existing ones, so a new cache file is created. Specifically for your site, e.g. in this line

    var dt_ajax = {"ajaxurl":"http:\/\/fotoandfashion.de\/wp-admin\/admin-ajax.php","tax_kboom":"5721"};

    the tax_kboom value is different for each and every page, which causes a new cache file to be created. If you find all JavaScript that behaves that way and exclude it from optimization, you should end up with a lot fewer files in wp-content/cache/autoptimize.
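
    Below is a minimal sketch (in PHP, the plugin's language) of the mechanism just described: the cache filename is derived from a hash of the aggregated JS, so a single differing character yields a whole new file. Treating md5 as the hash and the filename pattern shown in the comments are illustrative assumptions, not Autoptimize's verbatim internals.

    <?php
    // Two pages whose aggregated JS differs only in the tax_kboom value.
    $page_a = 'var dt_ajax = {"ajaxurl":"http:\/\/fotoandfashion.de\/wp-admin\/admin-ajax.php","tax_kboom":"5721"};';
    $page_b = 'var dt_ajax = {"ajaxurl":"http:\/\/fotoandfashion.de\/wp-admin\/admin-ajax.php","tax_kboom":"5722"};';

    // One character apart, yet the hashes (and thus the cache filenames) differ.
    echo md5( $page_a ) . "\n"; // e.g. -> wp-content/cache/autoptimize/autoptimize_<hash-a>.js
    echo md5( $page_b ) . "\n"; // different hash -> a second, near-duplicate cache file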

    Hope this helps!

  3. fotofashion
    Member
    Posted 1 year ago #

    Thank you for the swift reply! Apart from the size of the cache, is there a reason for me not to use JS optimisation? I mean, will it be less efficient if I end up with one cache file per page? I am using Autoptimize in combination with Quick Cache.

  4. futtta
    Member
    Plugin Author

    Posted 1 year ago #

    If you assume a random visitor sees more than one blog post, then yes, having a separate aggregated JS file per blog post is inefficient, considering that 99% of each aggregated file will be identical. So yes, in that case you would be better off not aggregating JS at all (although that should be proven with hard numbers from tests rather than just an educated guess).

    The optimal solution would be for you to exclude all JavaScript that is unique to a page from being aggregated by Autoptimize. For example, I think that when you add "tax_kboom" to the list of JS to be excluded, the entire block of inline JS will be left as is.
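
    As a hedged sketch of that exclusion: besides typing "tax_kboom" into the JS-exclusion field on the settings page, Autoptimize exposes a filter for the same list. The filter name autoptimize_filter_js_exclude and its comma-separated-string signature are assumptions based on recent releases, not necessarily the version discussed here.

    <?php
    // In the active theme's functions.php: append "tax_kboom" to the
    // comma-separated exclusion list, so any script block containing that
    // string is left out of the aggregated cache file.
    add_filter( 'autoptimize_filter_js_exclude', function ( $exclude ) {
        return $exclude . ',tax_kboom';
    } );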

    A bit of trial-and-error work, but no one said performance optimization would be easy ;-)

  5. fotofashion
    Member
    Posted 1 year ago #

    Wow! That did the trick. Frank, you are my hero! ;)
