It’s likely bot traffic, which Google filters out of its analytics. To get a better idea of this, look for search queries in your server access logs. If you can identify bots that run a lot of searches, you can use the relevanssi_bots_to_not_log filter hook to exclude their traffic from the stats.
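A minimal sketch of using that hook, assuming the filter passes an array of bots in "name => user agent substring" format (the bot names and user agent strings below are placeholders; replace them with whatever you actually find in your access logs):

```php
<?php
// Add to your theme's functions.php or a small custom plugin.
// Searches from visitors whose user agent contains one of these
// substrings would not be logged by Relevanssi.
add_filter( 'relevanssi_bots_to_not_log', function ( $bots ) {
    // Hypothetical examples; use the bot names you see in your logs.
    $bots['ExampleBot']    = 'ExampleBot';
    $bots['AnotherCrawler'] = 'AnotherCrawler/1.0';
    return $bots;
} );
```

This only stops the logging going forward; entries already in the search log stay there.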
I understand.
Almost all the search terms look legit, though. I thought bots would generate weird search strings. What is your experience with that?
I haven’t seen a lot of spam searches, but I have seen a lot of bot traffic that looks fairly legitimate yet still quite random. It doesn’t read as spam, and there’s a lot of it. Looking at the access log will tell you what it is.
The Predictive Search plugin was activated in the backend but not used on the frontend.
This seems to have messed up the search statistics. In the search log, for instance, a search for ”Searchstring” was logged as multiple entries like this:
”Sea”
”Sear”
”Searc”
”Search”
And so on.
Yes, that’s what that kind of searching does. Unfortunately, there’s not much that can be done about it. To Relevanssi, each individual search is independent, and Relevanssi can’t know which one is the last one. Logging and live search are not a great combination.
One solution I’ve been thinking about is disabling logging when Relevanssi Live Ajax Search is used, and then having Relevanssi Live Ajax Search trigger the logging once the user interacts with the search. But that, of course, would only work with Relevanssi Live Ajax Search, and it would mean unsuccessful searches aren’t logged at all, so it’s not a perfect solution either.
If you’re seeing lots of these kinds of searches, one thing that may help is increasing the search delay in the live search plugin, if it allows that. With a longer delay between typing and triggering the search, you’d see fewer of these intermediate searches, and it would save server resources, too.
I understand. No worries. Thank you so much for taking the time to reply.