DB at M-and-D.com
Wed Sep 15 20:44:58 UTC 2010
> On 09/15/2010 02:23 PM, DB wrote:
>> My catalog's access log has many entries like the following, which appear
>> to be caused by spiders. They don't seem to be much of a problem, but
>> I'm curious what these entries mean. Is it a failed search, for example
>> one against an expired session, or maybe something to do with spiders
>> not having a session assigned?
>> 220.127.116.11 nsession:18.104.22.168 - [15/September/2010:12:14:38
>> -0400] store
>> /cgi-bin/store/scan/MM=f6bfbefe9ff41ee8d877f715bb157e03:5500:5549:50.html search
>> error: Object saved wrong in
>> /catroot/tmp/n/nsession.f6bfbefe9ff41ee8d877f715bb157e03 for search ID
> scan/MM links are only meaningful within a particular session. Just
> disallow robots from following these links.
> LinuXia Systems => http://www.linuxia.de/
> Expert Interchange Consulting and System Administration
> ICDEVGROUP => http://www.icdevgroup.org/
> Interchange Development Team
Thanks. I don't want to disallow all scan links, since I have some valid
ones in various search engines. You're saying that blocking just these
session-bound links in my robots.txt should work?
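Assuming the session-bound links can be distinguished by the URL prefix
visible in the log entry above (/cgi-bin/store/scan/MM=...), a rule along
these lines might do it. The paths here are taken from that one log line
and are only a sketch; adjust them to the catalog's actual URL layout:

```text
# Hypothetical sketch: keep spiders off session-specific MM= scan links
# while leaving other scan URLs crawlable. Disallow matches by prefix,
# so this blocks anything starting with /cgi-bin/store/scan/MM=
User-agent: *
Disallow: /cgi-bin/store/scan/MM=
```

Since Disallow is a simple prefix match, other /cgi-bin/store/scan/ URLs
that don't carry the MM= session component would remain crawlable.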