Enabled object cache doesn’t allow filtering the robots meta
Hi there!
We’re using The SEO Framework on a number of websites with domain mapping enabled. We therefore want to allow indexing on the mapped domain, but not on the original WordPress domain.
To do that, we change the value of the robots meta on the front end, like this:
/**
 * Set robots to noindex for base URLs for The SEO Framework.
 *
 * @since 1.5.0
 * @since 1.5.2 Disable on admin so settings default to index
 * @version 1.0.1
 *
 * @param array $meta The meta attributes.
 * @return array The updated meta attributes.
 */
function my_the_seo_framework_robots_meta_array( $meta ) {
	// Do nothing on non-internal domains or in the admin area.
	if ( ! is_base_url() || is_admin() ) {
		return $meta;
	}

	$meta['noindex']  = 'noindex';
	$meta['nofollow'] = 'nofollow';

	return $meta;
}
add_filter( 'the_seo_framework_robots_meta_array', 'my_the_seo_framework_robots_meta_array' );
To improve performance, we added object caching to our sites. Unfortunately, this breaks the solution above: the filter no longer runs when object caching is enabled and a cached entry is available. The cached value usually gets set in the back end, so we cannot change it for the front end.
Thus, I would love to see a filter that runs after the cached value has been retrieved, so that it can still be changed according to is_admin(). As a workaround, I now have to buffer wp_head and change the robots meta string inside it, which is a dirty fix.
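For reference, the buffering workaround can be sketched roughly like this (a sketch only, with hypothetical function names; is_base_url() is assumed from our domain-mapping setup as in the filter above, and the regex assumes a quoted robots meta tag in the head output):

```php
<?php
/**
 * Sketch: rewrite the robots meta tag in the buffered head output.
 *
 * @param string $head_html The buffered wp_head output.
 * @return string The output with the robots meta set to noindex,nofollow.
 */
function my_rewrite_robots_meta( $head_html ) {
	// Replace the content of an existing robots meta tag.
	return preg_replace(
		'/(<meta\s+name=["\']robots["\']\s+content=["\'])[^"\']*(["\'])/i',
		'${1}noindex,nofollow${2}',
		$head_html
	);
}

function my_start_head_buffer() {
	// Only rewrite on the front end of the base URL.
	if ( is_admin() || ! is_base_url() ) {
		return;
	}
	ob_start( 'my_rewrite_robots_meta' );
}

function my_end_head_buffer() {
	if ( is_admin() || ! is_base_url() ) {
		return;
	}
	ob_end_flush();
}

// Wrap everything printed during wp_head in the output buffer.
// Guarded so this file is also loadable outside WordPress.
if ( function_exists( 'add_action' ) ) {
	add_action( 'wp_head', 'my_start_head_buffer', -9999 );
	add_action( 'wp_head', 'my_end_head_buffer', PHP_INT_MAX );
}
```

This only patches the printed HTML, so the cached object is left untouched, which is exactly why it feels like a dirty fix rather than a real solution.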