[Resolved] Slowing down requests raises the risk of exhausting Apache's request limit
(I know that the question of whether it is good practice to slow down requests that come from brute force attempts has been discussed many times before. However, I believe I have something to add to it.)
First of all, I find it quite logical that one chooses to slow down these requests. The problem is that, by slowing them down, the connections are kept open much longer than usual, and it becomes very easy to reach the maximum number of simultaneous requests allowed by your Apache configuration. When that happens, your users will perceive the server as actually unresponsive!
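To make the worker-exhaustion argument concrete, here is a back-of-the-envelope sketch using Little's law (average concurrency = arrival rate × time per request). The rates, delays, and the MaxRequestWorkers figure below are illustrative assumptions, not numbers from any particular installation:

```python
def workers_busy(requests_per_second, seconds_per_request):
    """Little's law: average number of simultaneously open requests."""
    return requests_per_second * seconds_per_request

# Hypothetical brute-force wave of 50 login attempts per second:
fast = workers_busy(50, 0.2)    # ~0.2 s per request when answered at once
slowed = workers_busy(50, 10.0) # ~10 s per request when deliberately delayed

print(fast)    # 10.0 workers tied up on average
print(slowed)  # 500.0 workers tied up -- well past a typical
               # Apache MaxRequestWorkers of 256
```

So the same attack that barely registers when rejected quickly can saturate the whole worker pool once each request is artificially held open, which is exactly the effect described above.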
Maybe it would be better to just stop these requests immediately when they are detected. Of course, then you risk your server receiving a proper DDoS attack. Well, we have actually tried that ourselves lately in many of our clients' installations (more than 100, really) by tinkering with the plugin's code. And we have stopped a lot of attacks successfully (even as we speak), without suffering the effects of a DDoS attack. Maybe the attacks were too distributed!
I would like to repeat it once more: slowing down the requests and reaching your server's request limit is, from the users' point of view, equivalent to a DDoS attack.
So, why argue? Maybe we could add an option to the plugin so that the administrator chooses which mode they wish to enforce: a) slow down brute force requests or b) immediately stop them.
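The proposed option could look something like the sketch below. All names here are made up for illustration; this is not the plugin's real API, just a hedged outline of the two modes:

```python
import time

# Hypothetical mode names for the administrator's choice:
MODE_SLOW = "slow"    # current behavior: delay suspected brute-force requests
MODE_BLOCK = "block"  # proposed behavior: reject them immediately

def handle_suspected_brute_force(mode, delay_seconds=5):
    """Return an HTTP status code for a request flagged as brute force."""
    if mode == MODE_SLOW:
        time.sleep(delay_seconds)  # ties up a server worker for the whole delay
        return 200                 # then serve the (failed) login page as usual
    if mode == MODE_BLOCK:
        return 403                 # reject at once; the worker is freed immediately
    raise ValueError(f"unknown mode: {mode}")
```

The key difference is simply where the worker's time goes: in "slow" mode it is spent sleeping on the attacker's behalf, while in "block" mode the connection is closed and the worker is back in the pool.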
(Of course we already do have our own version of the plugin, but it would be nicer to have only one!)