Redis latency grows to 27s+ after ~24h (Lua/EVAL blocking?)
Hi all,
I’m running into a performance issue with Redis Object Cache on a WooCommerce site and wanted to check whether this is a known behavior or something I should configure differently.
After ~24 hours of uptime, Redis command latency (measured via slowlog and socket wait) increases dramatically. I’ve observed Redis operations taking up to 37,000 ms. When this happens, wp-admin becomes extremely slow (product edit pages can take tens of seconds). Flushing the Redis cache immediately resolves the issue, but it slowly returns over the next day.
My environment: Redis connected via a UNIX socket, NGINX, and a large webshop with many transients and object-cache entries (10K products with 10 attributes each; WP All Import runs twice a day, pulling from multiple suppliers).
Redis has ~200k keys after running for some time.
The largest key is wp:options:alloptions (~1.4 MB).
There is also a growing zset, wp:redis-cache:metrics (thousands of members).
Redis SLOWLOG consistently shows very long-running EVAL commands (300–340 seconds cumulative runtime across calls).
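To be clear about what I mean by "cumulative runtime": I summed the SLOWLOG durations per command. A minimal Python sketch of that aggregation (the sample entries below are made up for illustration, not my actual slowlog; real entries would come from `redis-cli slowlog get` or redis-py's `slowlog_get()`):

```python
from collections import defaultdict

# Each SLOWLOG entry carries (id, timestamp, duration in microseconds,
# argument array, ...); only the fields needed here are modeled.
sample_slowlog = [
    {"duration_us": 2_100_000, "args": ["EVAL", "<script>", "1", "wp:redis-cache:metrics"]},
    {"duration_us": 1_800_000, "args": ["EVAL", "<script>", "1", "wp:redis-cache:metrics"]},
    {"duration_us": 150_000,   "args": ["GET", "wp:options:alloptions"]},
]

def cumulative_by_command(entries):
    """Sum SLOWLOG durations (microseconds) per command name."""
    totals = defaultdict(int)
    for entry in entries:
        command = entry["args"][0].upper()
        totals[command] += entry["duration_us"]
    return dict(totals)

totals = cumulative_by_command(sample_slowlog)
for command, micros in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{command}: {micros / 1_000_000:.2f}s cumulative")
```

On my instance it is the EVAL bucket that dominates by a wide margin.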
Redis is otherwise healthy (no swapping, sufficient memory). A redis-cli --bigkeys scan shows:

Total key length in bytes is 40278387 (avg len 115.69)
Biggest string found "msOrZN^=;#0M<*cwh=N/_=o8uH#~qF%, rdMPe)F5f XS/Y4&A$NBa}?;;5=PT-rwp:options:alloptions" has 1458280 bytes
Biggest zset found "msOrZN^=;#0M<*cwh=N/_=o8uH#~qF%, rdMPe)F5f XS/Y4&A$NBa}?;;5=PT-rwp:redis-cache:metrics" has 1540 members
0 lists with 0 items (00.00% of keys, avg size 0.00)
0 hashs with 0 fields (00.00% of keys, avg size 0.00)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
348152 strings with 244424308 bytes (100.00% of keys, avg size 702.06)
0 sets with 0 members (00.00% of keys, avg size 0.00)
1 zsets with 1540 members (00.00% of keys, avg size 1540.00)

Diagnostics from the Redis Object Cache plugin:

Status: Connected
Client: PhpRedis (v6.2.0)
Drop-in: Valid
Disabled: No
Ping: 1
Errors: []
PhpRedis: 6.2.0
Relay: Not loaded
Predis: 2.4.0
Credis: Not loaded
PHP Version: 8.1.33
Plugin Version: 2.7.0
Redis Version: 8.2.0
Multisite: No
Metrics: Enabled
Metrics recorded: 8692
Filesystem: Writable
Global Prefix: "wp_"
Blog Prefix: "wp_"
Timeout: 10
Read Timeout: 10
Retry Interval:
WP_REDIS_SCHEME: "unix"
WP_REDIS_PATH: "/home/ANONYMIZED/.redis/redis.sock"
WP_REDIS_TIMEOUT: 10
WP_REDIS_READ_TIMEOUT: 10
WP_REDIS_MAXTTL: 3600
WP_REDIS_PREFIX: "msOrZN^=;#0M<*cwh=N/_=o8uH#~qF%, rdMPe)F5f XS/Y4&A$NBa}?;;5=PT-r"
WP_CACHE_KEY_SALT: "msOrZN^=;#0M<*cwh=N/_=o8uH#~qF%, rdMPe)F5f XS/Y4&A$NBa}?;;5=PT-r"
WP_REDIS_PLUGIN_PATH: "/home/ANONYMIZED/domains/ANONYMIZED/public_html/wp-content/plugins/redis-cache"
WP_REDIS_IGNORED_GROUPS: [
"wp_all_import",
"wp-all-import-pro",
"wp-all-import-pro"
]
Global Groups: [
"blog-details",
"blog-id-cache",
"blog-lookup",
"global-posts",
"networks",
"rss",
"sites",
"site-details",
"site-lookup",
"site-options",
"site-transient",
"users",
"useremail",
"userlogins",
"usermeta",
"user_meta",
"userslugs",
"redis-cache",
"blog_meta",
"image_editor",
"network-queries",
"site-queries",
"theme_files",
"translation_files",
"user-queries",
"code_snippets",
"woo_variation_swatches"
]
Ignored Groups: [
"wp_all_import",
"wp-all-import-pro",
"counts",
"plugins",
"theme_json",
"themes"
]
Unflushable Groups: []
Groups Types: {
"blog-details": "global",
"blog-id-cache": "global",
"blog-lookup": "global",
"global-posts": "global",
"networks": "global",
"rss": "global",
"sites": "global",
"site-details": "global",
"site-lookup": "global",
"site-options": "global",
"site-transient": "global",
"users": "global",
"useremail": "global",
"userlogins": "global",
"usermeta": "global",
"user_meta": "global",
"userslugs": "global",
"redis-cache": "global",
"wp_all_import": "ignored",
"wp-all-import-pro": "ignored",
"blog_meta": "global",
"image_editor": "global",
"network-queries": "global",
"site-queries": "global",
"theme_files": "global",
"translation_files": "global",
"user-queries": "global",
"counts": "ignored",
"plugins": "ignored",
"theme_json": "ignored",
"code_snippets": "global",
"themes": "ignored",
"woo_variation_swatches": "global"
}
Drop-ins: [
"Redis Object Cache Drop-In v2.7.0 by Till Krüss"
]

I understand that Redis keys can increase drastically when you have many products, attributes, filters, and users in a WooCommerce webshop. That part makes sense to me. However, I find it hard to believe that I am the only one running into this issue. It feels like something is going wrong structurally rather than just normal growth.
My knowledge of Redis is not deep enough to fully understand the root cause. An LLM suggested that I should look into Lua usage, but this is a new concept for me and I am not sure how to interpret that advice in the context of Redis Object Cache.
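In case it helps the discussion: since the metrics zset keeps growing, one experiment I am considering is turning off the plugin's metrics collection in wp-config.php. I believe the constant below is the one Redis Object Cache documents for this, but please correct me if that is wrong for plugin version 2.7.0:

```php
// wp-config.php — an experiment, not a confirmed fix:
// disable Redis Object Cache's metrics collection so the
// wp:redis-cache:metrics zset (and whatever pruning it triggers)
// stops being written. Constant name per the plugin's docs;
// double-check it applies to your plugin version.
define( 'WP_REDIS_DISABLE_METRICS', true );
```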
I would like to ask if anyone here has experienced something similar or can provide some guidance. Any insights or shared experiences would be greatly appreciated.