Relevanssi - A Better Search
[resolved] Bot-Logging Problem (w/ tested solution) (4 posts)

  1. justin_k
    Posted 2 years ago #

    Hey Mikko,

    I can't remember if I've already asked for this, so apologies if so.

    Would it be possible to add a filter to the start of relevanssi_update_log()? You might recall that a while back, I noticed Google Analytics causing every search to be logged twice; more recently I've noticed that many other situations cause similar double-logging. For instance, AddThis (social sharing widgets) does the same thing: soon after a "real" search, its JavaScript hits the same page, causing the search to be logged again. I'd never expect you to maintain a list of every user agent that does this, but a filter would let us disable logging for particular user agents as needed:

    if ( isset( $_SERVER['HTTP_USER_AGENT'] ) ) {
        $user_agent = $_SERVER['HTTP_USER_AGENT'];
        $bots       = array( 'Google' => 'Mediapartners-Google' );
        $bots       = apply_filters( 'relevanssi_bots_to_not_log', $bots );
        foreach ( $bots as $name => $lookfor ) {
            if ( stristr( $user_agent, $lookfor ) !== false ) {
                return; // known bot: skip logging this search
            }
        }
    }
    Then I can maintain my own list of bots for exclusion.

    Thanks in advance :)


  2. justin_k
    Posted 2 years ago #

    PS - the reason for the associative array format is to match a similar list used by other plugins - e.g. wp-useronline, which shows how many 'bots' vs. 'real people' are on your site, and a per-post view counter that only logs views from non-bots. By keeping the same array format, one shared botlist filter can be applied to Relevanssi's logging as well :)
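
    A site could then hook the proposed filter from a theme's functions.php. This is a sketch, assuming the 'relevanssi_bots_to_not_log' hook is added as proposed above; the jk_extra_bots_to_skip name and the 'AddThis' entry are illustrative, not vetted user-agent strings:

    ```php
    <?php
    // Hypothetical callback for the proposed 'relevanssi_bots_to_not_log'
    // filter. Entries use the same label => user-agent-substring format
    // as wp-useronline, so one list can feed several plugins.
    function jk_extra_bots_to_skip( $bots ) {
        $bots['AddThis'] = 'AddThis'; // example: social-sharing widget re-fetching result pages
        return $bots;
    }

    // Register the callback when running inside WordPress.
    if ( function_exists( 'add_filter' ) ) {
        add_filter( 'relevanssi_bots_to_not_log', 'jk_extra_bots_to_skip' );
    }
    ```

    Because the callback receives and returns the whole array, several plugins (or a child theme) can each append their own entries without clobbering the defaults.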

  3. Mikko Saari
    Plugin Author

    Posted 2 years ago #

    Sure, I can add this.

  4. justin_k
    Posted 2 years ago #

    Thx :)

Topic Closed

