[ic] Setting high RobotLimit ... where to set it?

Bryan D Gmyrek gmyrek at U.Arizona.EDU
Thu Sep 16 16:14:43 EDT 2004


I have a site with over 100,000 products and want to make sure
Google and the other spiders can index them all.  In case several
spiders are crawling the site at once, I want RobotLimit to be
high enough to let them hammer away, but low enough to stop DoS
attacks.  Any suggestions on a limit?  100, 500, or 1000?
The documentation is a bit confusing to me.  It says:
"The RobotLimit directive defines the number of consecutive pages
a user session may access without a 30 second pause. If the limit
is exceeded, the command defined in the Global directive
LockoutCommand will be executed and catalog URLs will be
rewritten with 127.0.0.1 as the host, sending the robot back to
itself.
The default is 0, disabling the check."
So does it mean that you could access 200 pages as long as there
is a 30 second pause between each access, or that you can access
200 pages as long as your pauses add up to a total of 30 seconds?
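
For reference, here is how I am planning to set it, in case I
have the placement wrong (the 500 limit and the lockout command
are only illustrative guesses; as I read the docs, RobotLimit is
a catalog directive and LockoutCommand is global):

    # catalog.cfg -- per-catalog setting (the value 500 is a guess)
    RobotLimit  500

    # interchange.cfg -- global setting; %s should be replaced with
    # the offending host (the command itself is just an example)
    LockoutCommand  "echo 'locked out %s' >> /var/log/interchange/lockout.log"
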

Bryan Gmyrek
