Actually that’s a good thing, it means your visitors are viewing multiple pages.
That’s expected behaviour.
But some of the bloggers in our network are beginning to think the statistics are fake (after all, they know their readers better!). Maybe it would be good to review your methods of filtering robots.
The robot code has been stable for quite a while; however, you may be getting hit by a robot we don’t know about.
You can see more details by looking at the browser stats. If you see something in the list that isn’t a real browser, then you know there’s a robot that we’re not filtering.
Also, some sites need the “Coefficient per visitor” setting set to something other than 1 if the site generates more than one hit per visit.
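As a rough sketch of what a setting like that does, assuming the coefficient simply divides the raw hit count (the function name and exact behaviour here are my assumptions, not the plugin’s actual code):

```python
# Hypothetical sketch of a "Coefficient per visitor" adjustment.
# Assumption: the coefficient is the average number of hits a single
# visitor generates, so dividing raw hits by it estimates real visitors.

def estimated_visitors(raw_hits: int, coefficient: float = 1.0) -> float:
    """Divide raw hits by the average hits each visitor generates."""
    return raw_hits / coefficient

# A site where every page view fires two hits (e.g. an extra tracked
# resource) would use a coefficient of 2:
print(estimated_visitors(200, coefficient=2))  # 100.0
```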
So the plugin only filters robots according to its list, not according to their behavior.
Yes, behaviour is too difficult to try and track.
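The list-based filtering described above can be sketched roughly like this (the signature list and function name are illustrative assumptions, not the plugin’s actual code):

```python
# Minimal sketch of list-based robot filtering: match the user-agent
# string against known robot signatures. A robot whose signature is
# missing from the list slips through and gets counted as a visitor,
# which is how fake-looking "browsers" end up in the browser stats.

ROBOT_SIGNATURES = [
    "googlebot",
    "bingbot",
    "slurp",      # Yahoo's crawler
    "crawler",
    "spider",
]

def is_robot(user_agent: str) -> bool:
    """Return True if the user-agent matches a known robot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in ROBOT_SIGNATURES)

print(is_robot("Mozilla/5.0 (compatible; Googlebot/2.1)"))        # True
print(is_robot("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0"))    # False
```

This is why an unknown robot needs to be reported and added to the list before it stops inflating the stats.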