Hello!
As you found, this is intentional behavior: in WordPress, taxonomies belong to post types, so the plugin trickles the noindex setting down from the post type to its taxonomies. Legacy support also plays a role in why we chose to keep this behavior.
To achieve what you wish, we need to approach this backward. The gist is that we apply noindex as late as possible, as an override. That way, unchecking the post type's noindex setting won't accidentally undo the protection for its taxonomies.
So, first uncheck the noindex option for the post type.
Then, apply noindex to every singular (i.e., non-archive) view of the post type via a filter:
add_filter( 'the_seo_framework_robots_meta_array', function( $meta, $args, $ignore ) {
	$tsf = the_seo_framework();

	if ( null === $args ) {
		// In the loop: test the current singular query.
		$force_noindex = $tsf->is_singular()
			&& 'product' === $tsf->get_current_post_type()
			&& (
				   ( $ignore & \The_SEO_Framework\ROBOTS_IGNORE_SETTINGS ) // If this is true, then ignore settings.
				|| $tsf->get_post_meta_item( '_genesis_noindex' ) > -.33
			);
	} else {
		// Out of the loop (e.g., sitemap): test the requested post ID.
		$force_noindex = empty( $args['taxonomy'] )
			&& 'product' === $tsf->get_post_type_real_ID( $args['id'] )
			&& (
				   ( $ignore & \The_SEO_Framework\ROBOTS_IGNORE_SETTINGS ) // If this is true, then ignore settings.
				|| $tsf->get_post_meta_item( '_genesis_noindex', $args['id'] ) > -.33
			);
	}

	$force_noindex
		and $meta['noindex'] = 'noindex';

	return $meta;
}, 10, 3 );
(Public gist)
I tested the filter, and it works perfectly: the robots meta tags output as expected, sitemap entries are cleared as expected, and the SEO Bar recognizes it. It also listens to the force-index setting on a per-product basis, if you so desire (on the product-edit screen: Visibility settings, Indexing dropdown, select "Index"). Even when you force-index a product, the default non-overridden state still resolves to noindex; that quirk is why I added that $ignore blob. Up to my standards, I must say 🙂
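If you're curious about the `-.33` comparison: the per-post Indexing setting (`_genesis_noindex`) is stored as a qubit, which I recall maps to -1 = force index, 0 = default, 1 = noindex. A minimal standalone sketch of that check (the helper name is mine, not part of the plugin):

```php
<?php
// Sketch of the qubit comparison used in the filter above.
// Assumed mapping for `_genesis_noindex`: -1 = index, 0 = default, 1 = noindex.
function qubit_allows_noindex( int $qubit ): bool {
	// `> -.33` matches 0 and 1, so only an explicit "Index" (-1) escapes the override.
	return $qubit > -.33;
}

var_dump( qubit_allows_noindex( -1 ) ); // bool(false) — force-index opts out
var_dump( qubit_allows_noindex( 0 ) );  // bool(true)  — default: override applies
var_dump( qubit_allows_noindex( 1 ) );  // bool(true)  — noindex: override applies
```

The fractional threshold simply splits -1 from {0, 1} without needing two equality checks.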
I hope this helps. Cheers!