[ic] IC not responding

John1 list_subscriber at yahoo.co.uk
Tue Nov 15 06:39:51 EST 2005


On Tuesday, November 15, 2005 1:58 AM, rphipps at reliant-solutions.com wrote:

>> From: interchange-users-bounces at icdevgroup.org
>> [mailto:interchange-users- bounces at icdevgroup.org] On Behalf Of John1
>> Sent: Monday, November 14, 2005 4:06 PM
>>
>> On Monday, November 14, 2005 7:12 PM, racke at linuxia.de wrote:
>>
>>> Ron Phipps wrote:
>>>>> From: interchange-users-bounces at icdevgroup.org
>>>>
>>>> [mailto:interchange-users-
>>>>
>>>>> bounces at icdevgroup.org] On Behalf Of jeff at hisgirlfridays.com
>>>>> Sent: Friday, November 11, 2005 11:24 AM
>>>>>
>>>>> On Fri, Nov 11, 2005 at 08:40:30AM -0800, Ron Phipps wrote:
>>>>>
>>>>>
>>>>>> We've got a weird issue going on right now where all of a sudden
>>>>>> IC will stop servicing requests and utilization of the server
>>>>>> goes to just about 0.  If I restart IC everything goes back to
>>>>>> normal.  This has gone on for the last 3 days at different points
>>>>>> in the day.  The ip address
>>>>>
>>>>> The next time this happens, restart apache instead of IC and tell
>>>>> me if it goes back to normal.  If it does, then I have more to
>>>>> say about the issue and also have a potential fix.
>>>>>
>>>>> Jeff
>>>>
>>>>
>>>> Unfortunately this was not the case.  The potential fix of using
>>>> apache's MaxRequestsPerChild 1000-5000 did not seem to help.
>>>>
>>>> Does anyone know of a way to troubleshoot IC, perhaps with debug
>>>> statements to see why it can no longer handle requests?  IC is
>>>> still running, I can see all the rpc servers, however they do not
>>>> take up much cpu usage and it's almost like they are not even
>>>> there. Apache/mod_interchange does not error out and a restart of
>>>> IC instantly allows Apache to start handling requests again.
>>>
>>> First of all, you should try to strace all the IC processes to see
>>> whether they are making system calls, and watch your logfiles (both
>>> the IC and system logfiles) as well. If no system calls are
>>> happening, IC might be caught up in an infinite loop somewhere.
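Racke's suggestion can be followed with a short loop that attaches strace to each Interchange process for a few seconds and writes one log per PID. This is only a sketch: the "interchange" pattern passed to pgrep is an assumption (adjust it to match your process list), and `timeout` requires GNU coreutils.

```shell
# Find the Interchange server PIDs -- the "interchange" pattern is an
# assumption; adjust to match how the rpc servers show up in `ps`.
pids=$(pgrep -f interchange)

# Attach strace to each process for ~10 seconds, one log file per PID.
# A log that stays empty means that process made no system calls in the
# window -- consistent with a tight infinite loop.
for pid in $pids; do
    timeout 10 strace -f -p "$pid" -o "/tmp/ic-strace.$pid" &
done
wait

ls -l /tmp/ic-strace.* 2>/dev/null || true
```

Inspecting the logs afterwards (e.g. with `grep` for `accept`, `read`, or `flock`) shows whether the servers are blocked on I/O, waiting on a lock, or simply idle.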
>> Do you know if you are bumping up against your Apache MaxClients
>> setting (in httpd.conf)?
>>
>> i.e. Can you see the following entry in your Apache error.log when
>> the problems start?
>> "server reached MaxClients setting, consider raising the MaxClients
>> setting"
>>
>> Just a possibility, as we are having a similar problem (but may be
>> unrelated
>> to yours) caused by bumping up against Apache MaxClients (which I
>> have just
>> posted to mailing list as "mod_interchange and Apache MaxClients").
>>
>
> Thanks for this information.  I was looking in the error log for the
> particular vhost and had not looked in the Apache site-wide error log,
> and sure enough, each time the server "went down" I see the MaxClients
> reached error.  I'm currently at 256 MaxClients; I'm going to raise
> that to 1024 tonight after I recompile Apache.  We've also noticed
> that during the times this issue occurred we were being crawled by
> search engines and also had scripts probing for holes in our Apache
> install.
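For reference, on Apache 1.3 a MaxClients value above the compiled-in HARD_SERVER_LIMIT (256 by default) does require a recompile, as Ron describes; on Apache 2.x with the prefork MPM it is a runtime matter of raising ServerLimit alongside MaxClients. A sketch of the relevant httpd.conf fragment, with illustrative values rather than recommendations:

```apache
# Illustrative values only -- tune to your RAM and per-child footprint.
# Apache 2.x prefork: ServerLimit must be >= MaxClients.
<IfModule prefork.c>
    ServerLimit          1024
    MaxClients           1024
    MaxRequestsPerChild  2000
</IfModule>
```

Each prefork child holds a full process's worth of memory, so quadrupling MaxClients without checking available RAM can trade "connections refused" for swapping.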
>
> Thanks for your help!
> -Ron
>
From reading around on the web, it would seem that MaxClients 150 ought to 
be enough even for moderately busy web servers.  Does anyone else have a 
view on this?  Ron, I can understand your idea of raising MaxClients to 
mitigate the problem (although I can't try this myself, as my Virtual 
Server environment restricts the number of concurrent processes I can run).

However, I am sure you will agree that raising MaxClients is really a 
workaround rather than a solution to the underlying problem.  I can only 
presume that this problem of scripts hammering websites while probing for 
security holes is common to every website on the Internet, and therefore 
presumably every website is bumping up against MaxClients on a regular 
basis.  I presume that most websites come back to life as soon as the 
script robot goes away, but even so, any website will be brought down by 
this sort of robot for the duration of its visit.  So, to get to my point: 
are there any easy ways to stop these script robots from racking up the 
MaxClients count in the first place?  That is, is there anything in Apache 
or its modules that can be used to spot these "robot attacks" and drop 
their requests before they cause Apache to spawn loads of processes?  I'd 
be grateful for any ideas, especially as upping the MaxClients setting is 
not really an option for me.  Thanks
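One option along these lines is the third-party mod_evasive module, which tracks per-client request rates in a hash table and answers 403 to clients that exceed a threshold, before a request ties up a child for long. A sketch of its httpd.conf configuration; the directive names are mod_evasive's own, but the threshold values here are assumptions to tune for your traffic:

```apache
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        5     # same URI more than 5 times per interval
    DOSPageInterval     1     # ...per 1-second interval
    DOSSiteCount        50    # more than 50 requests site-wide per interval
    DOSSiteInterval     1
    DOSBlockingPeriod   60    # answer 403 for 60 seconds once blacklisted
</IfModule>
```

Well-behaved crawlers can instead be slowed with a robots.txt `Crawl-delay` directive, though the hole-probing scripts described above will ignore robots.txt entirely.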
