The article that @korba12 linked to clearly shows that some companies are motivated by malice and not simply incompetence.
I was reading that, and one thought crossed my mind about how to get rid of the problem: it is difficult for a human to visit more than 3 pages per second (and that already assumes good response times and someone who reads very fast).
Presuming there are 500 real humans reading the forum and they are all "speed-pagers" (new term, I'm going to patent it), the server would see at most 1500 requests per second. That's still a fair amount, but once the server sees that kind of load it could _require_ the user to be _logged in_ before honoring the request. No login = request ignored.
This would cause the number of requests to drop rapidly, and the rule could kick in at a lower threshold, e.g. 500 requests per second, to keep response times reasonably speedy at all times. A rough sketch of the idea is below.
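Here's a minimal sketch of that gate in Python. The threshold, the `is_logged_in` flag, and the `gate_request` hook are all assumptions on my part for illustration, not any forum engine's actual API:

```python
# Sketch: only require login once the server is under heavy load.
import time
from collections import deque

THRESHOLD_RPS = 500        # requests/second at which the login rule kicks in
WINDOW_SECONDS = 1.0       # how far back we look when measuring load

_recent = deque()          # timestamps of recently seen requests


def current_rps() -> float:
    """Record this request and return requests seen in the last window."""
    now = time.monotonic()
    _recent.append(now)
    while _recent and now - _recent[0] > WINDOW_SECONDS:
        _recent.popleft()
    return len(_recent) / WINDOW_SECONDS


def gate_request(is_logged_in: bool) -> bool:
    """Return True if the request should be served.

    Under normal load everyone is served; once load exceeds
    THRESHOLD_RPS, only logged-in users get a response.
    """
    if current_rps() <= THRESHOLD_RPS:
        return True
    return is_logged_in
```

The point is that the gate only activates under load, so anonymous readers are completely unaffected in normal operation.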
Since bots rarely log in, that could be a first line of defense. Known bots that behave themselves could be issued a username and password by the administrator.
Just thinking out loud, but there has to be a way to insert long delays on non-human activity, along the lines of the sketch below.
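A similarly hedged sketch of the "long delays" idea: a simple tarpit that only slows anonymous clients paging faster than a human plausibly could. Again, the names and numbers are made up for illustration:

```python
# Sketch: delay responses to anonymous clients that exceed human reading speed.
import time

HUMAN_MAX_PAGES_PER_SEC = 3   # the "speed-pager" ceiling discussed above
TARPIT_DELAY_SECONDS = 10     # long enough to make scraping painfully slow


def maybe_tarpit(pages_last_second: int, is_logged_in: bool) -> None:
    """Sleep before answering if the client looks non-human and isn't logged in."""
    if not is_logged_in and pages_last_second > HUMAN_MAX_PAGES_PER_SEC:
        time.sleep(TARPIT_DELAY_SECONDS)
```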