[ic] IC not responding

Sandy Thomson sandy at scotwebshops.com
Tue Nov 15 09:33:35 EST 2005


John1 wrote:

> From reading around on the web, it would seem that MaxClients 150
> ought to be enough even for moderately busy web servers.  Does anyone
> else have a view on this?  Ron, I can understand your idea of raising
> MaxClients to help with the problem (although I can't try this myself,
> as my Virtual Server environment restricts the number of concurrent
> processes I can run).
>
> However, I am sure you will agree that raising MaxClients is really a
> workaround rather than a solution to the underlying problem.  I can
> only presume that scripts hammering websites in search of security
> holes must be a problem common to every website on the Internet, and
> therefore presumably every website is bumping up against MaxClients
> on a regular basis!?  I presume that most websites come back to life
> as soon as the script robot goes away, but even so, any website will
> be brought down by this sort of robot for the duration of its visit.
> So, to get to my point, are there any easy ways to stop these script
> robots racking up the MaxClients count in the first place?  I.e. is
> there anything in Apache or Apache modules that can be used to spot
> these "robot attacks" and drop their requests before they cause
> Apache to spawn loads of processes?  I'd be grateful for any ideas,
> especially as upping the MaxClients setting is not really an option
> for me.  Thanks
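
For what it's worth, MaxClients lives in the MPM section of httpd.conf,
and with the prefork MPM it is also capped by ServerLimit (256 by
default), so it can't simply be pushed up indefinitely.  A typical
prefork stanza looks something like this (values purely illustrative,
not a recommendation):

    # httpd.conf, prefork MPM -- illustrative values only
    <IfModule prefork.c>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        ServerLimit         256
        MaxClients          150
        MaxRequestsPerChild   0
    </IfModule>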


Try mod_evasive, or use the robot blocking functionality in Interchange
in conjunction with iptables to drop packets from the malicious IPs.
Most common robots should not kill your webserver.
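
If you go the mod_evasive route, a minimal stanza for httpd.conf looks
roughly like the following (directive names are from the mod_evasive20
build for Apache 2.x; the thresholds are only a starting point and need
tuning for your traffic, and the notify address is a placeholder):

    # load and configure mod_evasive (mod_evasive20 build for Apache 2.x)
    LoadModule evasive20_module modules/mod_evasive20.so

    <IfModule mod_evasive20.c>
        DOSHashTableSize    3097
        DOSPageCount        5      # hits on the same URI per interval
        DOSSiteCount        50     # total hits per child per interval
        DOSPageInterval     1      # seconds
        DOSSiteInterval     1      # seconds
        DOSBlockingPeriod   60     # seconds the offender gets 403s
        DOSEmailNotify      you@example.com   # placeholder address
    </IfModule>

Once you have an abusive address (mod_evasive can also run a
DOSSystemCommand for you), dropping it at the firewall is a one-liner,
e.g. with a made-up example address:

    iptables -A INPUT -s 192.0.2.10 -j DROP

On the Interchange side, as far as I recall the robot matching in
interchange.cfg (the RobotUA / RobotIP lists usually included from
robots.cfg) mainly keeps identified robots from tying up sessions; it
won't by itself stop the hits reaching Apache, which is why pairing it
with iptables helps.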

Sandy.

