Re: benchmark program for HTTP servers

Robert S. Thau (rst@ai.mit.edu)
Wed, 1 Mar 1995 11:44:34 +0500


Date: Wed, 1 Mar 1995 08:28:07 +0500
Reply-To: marsj@ida.liu.se

[the original mail was lost during info.cern.ch downtime]
Hi,

I would like to profile my WWW server (or rather the CGI client),
and I'm looking around for a benchmark program/robot which would
automate the process. Basically, I'm looking for some software
which:

1. takes a base URL,
2. retrieves all URLs in the base document, but
3. does not go outside the server (e.g. by restricting the set of
   allowed URLs),
4. enforces a minimum time between HEADs/GETs,
5. runs under Unix (preferably SunOS 4.1 - I have ported software
   to HP-UX/Solaris 2.x/DEC OSF/4.3BSD/AIX/Ultrix/SGI/Linux)

I've written a logfile replay program which may be of some use to you
--- it was written on SunOS, and *ought* to be fairly portable (though
I understand that you've got to work a bit to find the right header
files on AIX). The program takes a log in Common Log Format, and
replays the GET transactions at a user-specified rate. (It's capable
of having multiple transactions open simultaneously, and in fact
that's the common case, up to a user-specified maximum --- NB: if that
maximum exceeds the number of file descriptors a single process can
have open, the program will croak).
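
To give you an idea of what it does with the log, here is a simplified
sketch (the idea only, not the code as it appears in monkey.c) of how
the replayable GETs get picked out of a Common Log Format file. The
sketch just prints the request lines; the real program writes them
down open sockets instead.

    /*
     * Simplified sketch, not the actual monkey.c code: read a Common
     * Log Format log on stdin and print the GET request lines that
     * would be replayed.
     */
    #include <stdio.h>
    #include <string.h>

    /* Extract the path of a GET request from one CLF line, e.g.
     *   host - - [date] "GET /index.html HTTP/1.0" 200 2326
     * Returns 1 on success, 0 if the line isn't a replayable GET. */
    static int clf_get_path(const char *line, char *path, size_t pathlen)
    {
        const char *start;
        const char *end;

        start = strchr(line, '"');            /* opening quote of request */
        if (start == NULL || strncmp(start + 1, "GET ", 4) != 0)
            return 0;                         /* skip POSTs, HEADs, junk  */
        start += 5;                           /* first character of path  */

        end = strchr(start, ' ');             /* space before "HTTP/1.0"  */
        if (end == NULL)
            end = strchr(start, '"');         /* old HTTP/0.9-style entry */
        if (end == NULL || (size_t)(end - start) >= pathlen)
            return 0;

        memcpy(path, start, end - start);
        path[end - start] = '\0';
        return 1;
    }

    int main(void)
    {
        char line[4096];
        char path[2048];

        while (fgets(line, sizeof line, stdin) != NULL) {
            if (clf_get_path(line, path, sizeof path))
                printf("GET %s HTTP/1.0\r\n\r\n", path);
        }
        return 0;
    }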

Every 100 transactions, the program reports the mean latency for those
100 transactions, and the cumulative mean latency. (Things get a
little hairier if it can't initiate transactions because the user-set
maximum are already in progress).
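
The bookkeeping behind that report is just a pair of running sums.
Roughly (again a sketch of the idea, not the code from monkey.c):

    /* Sketch of the latency reporting: a per-batch mean every 100
     * completed transactions plus a cumulative mean for the run.
     * This is an illustration, not the actual monkey.c code. */
    #include <stdio.h>

    #define BATCH 100

    struct latency_stats {
        double batch_sum;    /* seconds accumulated in current batch    */
        double total_sum;    /* seconds accumulated since run started   */
        long batch_count;    /* completed transactions in current batch */
        long total_count;    /* completed transactions in the whole run */
    };

    /* Call once per completed transaction with its latency in seconds. */
    static void record_latency(struct latency_stats *s, double seconds)
    {
        s->batch_sum += seconds;
        s->total_sum += seconds;
        s->batch_count++;
        s->total_count++;

        if (s->batch_count == BATCH) {
            printf("last %d: mean %.3f s   overall (%ld): mean %.3f s\n",
                   BATCH, s->batch_sum / s->batch_count,
                   s->total_count, s->total_sum / s->total_count);
            s->batch_sum = 0.0;
            s->batch_count = 0;
        }
    }

    int main(void)
    {
        struct latency_stats stats = { 0.0, 0.0, 0, 0 };
        int i;

        /* Feed in some made-up latencies just to show the report format. */
        for (i = 0; i < 250; i++)
            record_latency(&stats, 0.1 + 0.001 * (i % 50));
        return 0;
    }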

The program has some limitations --- it doesn't try to replay POSTs
(there isn't enough information logged to do it properly), and it
doesn't deal with HTTP authentication (so it's not useful at sites
which make heavy use of that feature). However, some people at other
sites have gotten good service out of it.

If you want to look it over, see ftp://ftp.ai.mit.edu/pub/users/rst/monkey.c

rst