[ic] Re: interchange-users Digest, Vol 47, Issue 18

Carl Bailey carl at carlbailey.net
Thu May 24 12:43:21 EDT 2007

On May 24, 2007, at 12:00 PM, kevin at cursor.biz wrote:

> Date: Wed, 23 May 2007 19:24:40 +0100
> From: Kevin Walsh <kevin at cursor.biz>
> Subject: Re: [ic] Rolling big tables (mysql)
> To: interchange-users at icdevgroup.org
> Message-ID: <200705231924.41590.kevin at cursor.biz>
> Content-Type: text/plain;  charset="iso-8859-1"
> Grant <emailgrant at gmail.com> wrote:
>>>> I do keep a separate table of robot UAs and match traffic rows to
>>>> them with op=eq to populate another table with robot IPs and
>>>> non-robot IPs for the day to speed up the report.  Don't you think
>>>> it would be slower to match/no-match each IC request to a known
>>>> robot UA and write to the traffic table based on that, instead of
>>>> unconditionally writing all requests to the traffic table?  If not,
>>>> excluding the robot requests from the traffic table would mean a
>>>> lot less processing for the report and a lot fewer records for the
>>>> traffic table.
>>> Perhaps you should create a column called "spider" in the traffic
>>> table and save a true or false value depending upon the
>>> [data session spider] value.  You can then generate reports
>>> "WHERE spider = 0" for ordinary users, or "WHERE spider = 1" for
>>> robots etc.  An index on the spider column would be nice, of course.
>> I let this roll around in my head for quite a while and I ended up
>> writing the IC page accesses to my traffic table based on [data
>> session spider] like you suggested.  This should mean a much smaller
>> traffic table and less processing when running a report on it.  We'll
>> see how much time it buys me before running the report takes too long
>> again.  I also need to set up indexes.
> Also, you may as well grab the latest robots.cfg file from CVS and
> "include" it into your interchange.cfg file.
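
The scheme suggested above — flag each row at write time from [data
session spider], index the flag, and filter on it in reports — can be
sketched with SQLite standing in for MySQL.  The table and column names
here are illustrative, not Interchange's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Traffic table with a boolean "spider" flag, set per-request from
# the session's spider status rather than by matching UAs afterwards.
cur.execute("""
    CREATE TABLE traffic (
        ip      TEXT,
        uri     TEXT,
        spider  INTEGER NOT NULL DEFAULT 0
    )
""")
# Index the flag so report queries filtering on it stay fast.
cur.execute("CREATE INDEX idx_traffic_spider ON traffic (spider)")

# One row per request; the flag is decided once, at insert time.
rows = [
    ("10.0.0.1",  "/index.html", 0),
    ("66.249.1.2", "/index.html", 1),  # a known robot session
    ("10.0.0.3",  "/cart.html",  0),
]
cur.executemany("INSERT INTO traffic VALUES (?, ?, ?)", rows)

# Reports then read the indexed flag instead of re-matching UAs per row.
human_hits = cur.execute(
    "SELECT COUNT(*) FROM traffic WHERE spider = 0").fetchone()[0]
robot_hits = cur.execute(
    "SELECT COUNT(*) FROM traffic WHERE spider = 1").fetchone()[0]
print(human_hits, robot_hits)  # → 2 1
```

The point of the index is that report queries become a cheap indexed
scan on one column, rather than a string match against every UA in a
robots table for every row.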

... and where would that be in CVS?  I read the MANIFEST, and it cited
only /debian/robot.cfg, but the file is not present there.

I guess either the file is missing or the manifest is wrong.

Grateful if you can point me in the right direction.
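
For reference, once the file is located, pulling it in should be a
one-line include — assuming robots.cfg is dropped alongside
interchange.cfg (the path is illustrative):

```
# In interchange.cfg -- adjust the path to wherever you put the
# robots.cfg you fetched from CVS.
include robots.cfg
```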

- - - - - - - - - - - - -
  Carl Bailey
  Triangle Research, Inc.
  tel: 919.323.8025
- - - - - - - - - - - - -

More information about the interchange-users mailing list