NAME

NotRobotUA — specify user-agents that will NOT be classified as crawler bots (search engines)

SYNOPSIS

useragent_string...

DESCRIPTION

The NotRobotUA directive defines a list of user-agent strings that will never be classified as crawler robots (search engines) visiting the site.

This directive takes priority over RobotUA. If the user agent matches NotRobotUA, the RobotUA check is skipped and the client is not treated as an unattended robot.

DIRECTIVE TYPE AND DEFAULT VALUE

Global directive

Default value: empty (no user agents are excluded from robot classification)

EXAMPLES

Example: Defining NotRobotUA

NotRobotUA <<EOR
  *wget*
EOR
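Because NotRobotUA takes priority over RobotUA, it is typically used to carve exceptions out of a broader RobotUA list. The sketch below is illustrative only; the RobotUA entries shown are assumptions, not recommended values, though RobotUA accepts the same wildcard-list syntax:

```
# Treat these user agents as crawler robots...
RobotUA <<EOR
  *crawler*
  *spider*
EOR

# ...but never classify wget-based clients as robots,
# even if they would otherwise match RobotUA.
NotRobotUA <<EOR
  *wget*
EOR
```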

NOTES

For more details regarding web spiders/bots and Interchange, see the robot glossary entry.

For more details regarding user sessions, see the session glossary entry.

AVAILABILITY

NotRobotUA is available in Interchange versions:

4.6.0-5.7.0 (git-head)

SOURCE

Interchange 5.7.0:

Source: lib/Vend/Config.pm
Line 485

['NotRobotUA',     'list_wildcard',      ''],

Source: lib/Vend/Config.pm
Line 3836 (context shows lines 3836-3840)

sub parse_list_wildcard {
	my $value = get_wildcard_list(@_,0);
	return '' unless length($value);
	return qr/$value/i;
}
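As the source above shows, the wildcard list is compiled into a single case-insensitive regular expression. The Python sketch below approximates that behavior, assuming each `*` matches any run of characters and each `?` matches a single character; the exact escaping rules of Interchange's get_wildcard_list may differ, and the function name here is purely illustrative:

```python
import re

def wildcard_list_to_regex(entries):
    """Roughly mirror parse_list_wildcard: turn shell-style wildcard
    entries (e.g. '*wget*' from a NotRobotUA list) into one
    case-insensitive compiled regex, or None for an empty list."""
    parts = [
        re.escape(e).replace(r'\*', '.*').replace(r'\?', '.')
        for e in entries
    ]
    if not parts:
        return None
    # Join the entries into a single alternation, matched case-insensitively
    return re.compile('|'.join(parts), re.IGNORECASE)

pattern = wildcard_list_to_regex(['*wget*'])
print(bool(pattern.search('Wget/1.21.4')))  # True: 'wget' matches case-insensitively
```

A user agent such as `Wget/1.21.4` therefore matches the `*wget*` entry from the example above and is exempted from robot classification.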

AUTHORS

Interchange Development Group

SEE ALSO

RobotIP(7ic), RobotUA(7ic)
