The user agents that are excluded by default are too strict.
E.g. they exclude Googlebot, which is bad since Google recently added page speed to its ranking algorithm.
It took me quite a while to figure out why Google and some performance-measuring sites reported my site as being so slow.
I deleted all user agents from the list in my configuration.
A question: why were the user agents in there in the first place? It seems that all robots can understand gzipped pages these days.
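For context, here is a sketch of the kind of exclusion list I mean, assuming a setup along the lines of Apache's mod_deflate (I don't know which software the defaults in question come from, so the exact directives may differ; these BrowserMatch patterns are the classic mod_deflate recipe for disabling compression for old browsers):

```apache
# Serve only gzipped text/html to Netscape 4.x
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have more problems; disable gzip entirely
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE pretends to be Netscape but handles gzip fine; re-enable it
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```

Deleting or narrowing entries like these is what I did in my configuration: the browsers they work around are ancient, and any bot whose User-Agent happens to match such a pattern gets served uncompressed pages.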