Support » Plugin: Relevanssi - A Better Search » Bot-Logging Problem (w/ tested solution)

  • Resolved justin_k


    Hey Mikko,

    I can’t remember if I’ve already asked for this, so apologies if so.

    Would it be possible to add a filter to the start of relevanssi_update_log()? You might recall that a while back, I noticed Google Analytics causing every search to be logged twice; I’ve more recently noticed that there are many other situations that cause similar double-logging. For instance, AddThis (social sharing widgets) does the same thing: soon after a “real” search, its JavaScript hits the same page, causing it to be logged again. While I’d never expect you to maintain a list of every single user agent that does this, what would be helpful is a filter so that we could disable logging for particular user agents as needed:

    if ( isset( $_SERVER['HTTP_USER_AGENT'] ) ) {
        $user_agent = $_SERVER['HTTP_USER_AGENT'];
        $bots = array( 'Google' => 'Mediapartners-Google' );
        $bots = apply_filters( 'relevanssi_bots_to_not_log', $bots );
        foreach ( $bots as $name => $lookfor ) {
            if ( stristr( $user_agent, $lookfor ) !== false ) {
                return; // Matched a known bot; skip logging this search.
            }
        }
    }

    Then I can maintain my own list of bots for exclusion.
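    For example, here’s a sketch of the kind of site-specific hook I have in mind, assuming the proposed `relevanssi_bots_to_not_log` filter name; the callback name and the extra user-agent fragments are just illustrations:

    ```php
    <?php
    // Extend the default bot list with other agents that re-request
    // the search results page and cause double-logging.
    function my_relevanssi_bot_list( $bots ) {
        $bots['AddThis'] = 'AddThis'; // example user-agent fragment
        return $bots;
    }

    // Guarded so the snippet is harmless outside WordPress.
    if ( function_exists( 'add_filter' ) ) {
        add_filter( 'relevanssi_bots_to_not_log', 'my_relevanssi_bot_list' );
    }
    ```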

    Thanks in advance 🙂

Viewing 3 replies - 1 through 3 (of 3 total)
  • PS – the reason for that array format is to match a similar list used by some other plugins – e.g. wp-useronline, which shows how many ‘bots’ vs. ‘real people’ are on your site, and a per-post view counter that only logs views from non-bots. By keeping the same array format, the same bot-list filter callback can be applied to Relevanssi’s logging as well 🙂

    Plugin Author Mikko Saari


    Sure, I can add this.

    Thx 🙂

  • The topic ‘Bot-Logging Problem (w/ tested solution)’ is closed to new replies.