[ic] search engine indexing scan/ MM=0f73bb47ac44f4e422.....

Jamie Neil jamie at versado.net
Sat Jun 17 04:00:47 EDT 2006

Kevin Walsh wrote:
>     1. Recognise spiders and don't paginate the list for them.  Set
>        ml=999999, or whatever seems large enough.

This is the way we've been doing it and it seems to work very well.

However, it's worth noting that if your pages end up too large (I'm not
sure what the exact limit is, but I think it's around 200KB), Googlebot
may ignore everything after a certain point. We just make sure that our
categories aren't too large and that the HTML on the search results
pages is as compact as possible.

Snippet from an ActionMap:

    if ($Session->{spider}) {
        # Spiders get one long, unpaginated result page
        $CGI->{mv_matchlimit}  = '100';
        $CGI->{mv_max_matches} = '100';
    }
    else {
        # Normal visitors get the usual 10 results per page
        $CGI->{mv_matchlimit} = '10';
    }
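
For completeness: the snippet assumes $Session->{spider} is already
being set, which normally comes from Interchange's robot-detection
directives in the global config. A sketch (illustrative only -- the
exact user-agent patterns are assumptions, adjust for your install):

    # interchange.cfg -- user agents matching RobotUA are treated as
    # robots, so $Session->{spider} is true for their requests and no
    # session ID is appended to generated URLs
    RobotUA  Googlebot, Slurp, msnbot, *spider*, *crawler*
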

Jamie Neil | <jamie at versado.net> | 0870 7777 454
Versado I.T. Services Ltd. | http://versado.net/ | 0845 450 1254

More information about the interchange-users mailing list