• Resolved januzi_pl

    (@januzi_pl)


    Hello,

    I’m trying to figure out why the website reports a cache miss on the first visit to a page (mostly product pages, sometimes a category page or the home page). I’ve set up the crawler, but it marks most of the URLs with the blue dot. My first suspicion is that the box with recently visited products (placed at the bottom of each product page) is interfering with the caching engine, but before I try to remove it I’d like to ask you for suggestions.

    The report number is: VBIWOGPJ

    The page I need help with: [log in to see the link]

Viewing 15 replies - 1 through 15 (of 24 total)
  • Plugin Support qtwrk

    (@qtwrk)

    please give me a full screenshot of the crawler summary page


    Screenshots are uploaded below. If you need anything more or different, please let me know.

    https://imgur.com/a/gjofdIh

    • This reply was modified 6 months, 2 weeks ago by kamilles2.
    Plugin Support qtwrk

    (@qtwrk)

    one of the most common causes is the cache being purged

    please try this: use a small sitemap, say 20-30 URLs, crawl them manually, then immediately after the crawler finishes, open a few of those URLs in a private window and see whether they give you a cache hit or not
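    The hit-or-miss check in that last step can be scripted instead of eyeballed. A minimal sketch, assuming the server emits the usual `X-LiteSpeed-Cache` response header on cached pages (the helper below is hypothetical, not part of the plugin):

    ```python
    # Classify a response's LiteSpeed cache status from its headers.
    # A served-from-cache response typically carries "X-LiteSpeed-Cache: hit";
    # a freshly generated one carries "miss"; a bypassed page may omit
    # the header entirely.
    def cache_status(headers: dict) -> str:
        lowered = {k.lower(): v for k, v in headers.items()}
        value = lowered.get("x-litespeed-cache")
        if value is None:
            return "not cached (header absent)"
        return value.lower()  # "hit" or "miss"

    # Example header sets, as a private-window visit might return:
    print(cache_status({"X-LiteSpeed-Cache": "hit"}))   # hit
    print(cache_status({"Content-Type": "text/html"}))  # not cached (header absent)
    ```

    In practice you would feed this the headers from a real request (e.g. `urllib.request.urlopen(url).headers`) for each URL in the small sitemap.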

    Thread Starter januzi_pl

    (@januzi_pl)

    I’ve tested the sitemap with a few products, the home page and a single category. Everything got marked with the green dot (mobile and desktop), the pages were delivered immediately, and the header had the word “hit”.

    I’ve replaced the sitemap with one that contains only products (2000 entries) and this time everything seems to be marked with the blue dot (at least for the mobile version; it’s still processing the links, so I can’t tell yet whether desktop is also affected).

    • This reply was modified 6 months, 2 weeks ago by januzi_pl. Reason: additional info about bigger sitemap
    Plugin Support qtwrk

    (@qtwrk)

    okay, this indicates the crawler is working as it should; what you’re facing looks more like a cache purge taking place. Please enable the purge log and see if it tells you anything

    Thread Starter januzi_pl

    (@januzi_pl)

    I’ve enabled full debug mode and restarted the 2000-link crawl. The map says there are 3 hits and the rest are just misses. The purge log is empty. I’ve picked one of the “blue” links and checked the indexing log. It had the lines:
    [Router] LSCWP_CTRL bypassed empty
    🔱 role id: failed, guest
    💰 X-LiteSpeed-Tag: 34f_product,34f_URL./produkt/plynny-nawoz-do-pelargonii-intensywne-barwy-05-l-target/,34f_Po.3183,34f_

    Then the log has a second “group” of lines with the html comments like:
    Object Cache [total] 13532 [hit_incall] 12693 [hit] 719 [miss_incall] 34 [miss] 86 [set] 75
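    As an aside, those object-cache counters can be sanity-checked: the hit ratio they imply is very high, which suggests the object cache itself is healthy. A quick check, using the numbers from the line above:

    ```python
    # Counters from the debug line:
    # Object Cache [total] 13532 [hit_incall] 12693 [hit] 719 [miss_incall] 34 [miss] 86
    hit_incall, hit, miss_incall, miss = 12693, 719, 34, 86

    total = hit_incall + hit + miss_incall + miss
    ratio = (hit_incall + hit) / total

    print(total)                    # 13532 -- matches the [total] counter
    print(round(ratio * 100, 1))    # 99.1  -- percent of lookups served from cache
    ```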

    And then the log gets confusing, because it mixes two different products:
    X-LiteSpeed-Tag: 34f_product,34f_URL./produkt/plynny-nawoz-do-roz-okazale-kwiaty-1-l-target/,34f_Po.3185,34f_
    Response headers — array (
    '7' => 'X-LiteSpeed-Tag: 34f_product,34f_URL./produkt/plynny-nawoz-do-pelargonii-intensywne-barwy-05-l-target/,34f_Po.3183,34f_',

    (maybe the reason is that different threads are appending their debug lines to the log and interleaving them?)

    And that’s pretty much it.

    Plugin Support qtwrk

    (@qtwrk)

    no no, blue is okay; it means the page was not cached before but has now been crawled/cached by the crawler. You need to wait for some time, then check the “purge log” part

    Thread Starter januzi_pl

    (@januzi_pl)

    Gotcha. The purge log started showing new entries, all of them related to the plugin “baselinker” (which is used to synchronise the prices and stock levels of the products).
    So, from 3:20PM till 9:00PM there were 100 requests to purge products. I’ve randomly visited several links, checked whether they responded slowly, checked whether they were purged by baselinker, and then compared a link that wasn’t in the purge log with the entries from the debug log. The log says:
    [Core] CHK html bypass: miss footer const
    💵 not cacheable before ctrl finalize
    💰 X-LiteSpeed-Cache-Control: no-cache
    💰 Cache-Control: no-cache, no-store, must-revalidate, max-age=0
    but then there is a line
    [Core] Silence Comment due to REST/AJAX
    that looks like an AJAX request and not like a regular GET.

    Plugin Support qtwrk

    (@qtwrk)

    now that makes sense: if you have something editing products, it will lead to purges

    and please share the log, I need to see more detail from it

    you can share it via https://pastebin.ubuntu.com/

    Thread Starter januzi_pl

    (@januzi_pl)

    The debug logs are here: https://pastebin.ubuntu.com/p/nyhxYDdrSy/ (a visit to a category and to a product, both with a slow response and a “miss” header)

    Purge:
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] Query String:
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] User Agent: BaseLinker/1.0
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] Accept: /
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] Accept Encoding: gzip
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] X-LSCACHE: true
    08/04/25 09:47:21.795 [IP2:39430 1 ZMX] X-LiteSpeed-Purge: public,34f_Po.4897,34f_URL./produkt/mix-salat-na-tasmie-kiepenkerl/,34f_W.recent-posts-1,34f_T.252,34f_T.2,34f_T.97,34f_T.93,34f_T.251,34f_FD,34f_A.1,34f_PT.product,34f_product,34f_F,34f_H,34f_PGS,34f_PGSRP,34f_D.202506,34f_REST => LiteSpeed\LSC->send_headers()@608 => WP_Hook->apply_filters(ARRAY)@324 => WP_Hook->do_action(ARRAY)@348 =>
    /public_html/wp-includes/load.php@517
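    The X-LiteSpeed-Purge header above is a comma-separated list of cache tags, so splitting it shows exactly which cache groups a single BaseLinker update invalidates. Notably, it includes the broad `34f_product` tag, which also appears in the X-LiteSpeed-Tag of the cached product pages quoted earlier in the thread, so one update may invalidate far more than one product page. A sketch, using the header value quoted above:

    ```python
    from collections import Counter

    # The X-LiteSpeed-Purge value from the BaseLinker request in the log:
    purge_header = ("public,34f_Po.4897,"
                    "34f_URL./produkt/mix-salat-na-tasmie-kiepenkerl/,"
                    "34f_W.recent-posts-1,34f_T.252,34f_T.2,34f_T.97,34f_T.93,34f_T.251,"
                    "34f_FD,34f_A.1,34f_PT.product,34f_product,34f_F,34f_H,"
                    "34f_PGS,34f_PGSRP,34f_D.202506,34f_REST")

    tags = purge_header.split(",")

    # Group tags by their type prefix (Po = post ID, T = term, URL = exact URL, ...),
    # dropping the "34f_" site prefix and the "public" scope marker.
    kinds = Counter(t.split(".", 1)[0].removeprefix("34f_")
                    for t in tags if t != "public")

    print(kinds["T"], kinds["Po"], "product" in kinds)  # 5 1 True
    ```

    The single post-ID tag (`Po.4897`) and exact-URL tag are narrow, but the bare `product` tag matches every cached product page carrying it.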

    Plugin Support qtwrk

    (@qtwrk)

    the link points to the page log, it doesn’t contain anything about the purging

    the purge log is kind of partial, there should be 1 line before it showing the main request, and do you know what this BaseLinker is?

    BaseLinker is an external system we use to manage products, orders, and inventory across multiple sales channels. It’s connected to our WooCommerce store through a plugin, and it keeps everything in sync automatically, especially product availability and pricing.

    Thread Starter januzi_pl

    (@januzi_pl)

    You’re right, I didn’t notice that this forum cut that line out. It’s just:
    PUT HTTP/1.1 (HTTPS) /wp-json/wc/v3/products/5866
    I can’t see anything that would have “purge_all” in the request.

    Does this mean that the server isn’t storing the cache of the pages?

    Plugin Support qtwrk

    (@qtwrk)

    this actually makes sense now: when something updates/modifies a product, the cache needs to be purged to reflect the change

    so if you have something constantly updating products in the background, it will keep purging the cache with every update

    Thread Starter januzi_pl

    (@januzi_pl)

    Yes, however there are only ~100-200 requests per day in the purge log. That shouldn’t purge everything in the cache, just those 100 products plus the categories they are attached to.
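    Whether ~100-200 purges per day invalidate "just those products" depends entirely on the tags in each purge request. If the purge list includes a shared tag, such as the `34f_product` tag visible both in the cached pages' X-LiteSpeed-Tag and in the BaseLinker purge request earlier in the thread, then each purge drops every page carrying that tag. A toy simulation of tag-based purging (the page and tag data are hypothetical):

    ```python
    # Toy cache: URL -> set of cache tags, mirroring the X-LiteSpeed-Tag groups.
    cache = {
        "/produkt/a/":   {"product", "URL./produkt/a/", "Po.1"},
        "/produkt/b/":   {"product", "URL./produkt/b/", "Po.2"},
        "/kategoria/x/": {"T.252"},
        "/":             {"H"},
    }

    def purge(cache, purge_tags):
        """Drop every cached page that shares at least one tag with the purge."""
        return {url: tags for url, tags in cache.items() if not (tags & purge_tags)}

    # A purge targeting only one post ID removes one page...
    narrow = purge(cache, {"Po.1", "URL./produkt/a/"})
    print(sorted(narrow))   # ['/', '/kategoria/x/', '/produkt/b/']

    # ...but a purge that includes the shared "product" tag removes them all.
    wide = purge(cache, {"Po.1", "product"})
    print(sorted(wide))     # ['/', '/kategoria/x/']
    ```

    So 100-200 narrow purges a day would indeed be harmless, but 100-200 wide ones spread across the day could keep re-emptying the product cache faster than the crawler refills it.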


The topic ‘Slow pages on the first visit, even with crawler enabled’ is closed to new replies.