Re: usage stats

William M. Perry (
Tue, 20 Apr 1993 22:02:07 -0500

Marc Andreessen writes:
> writes:
>> >Over the three (or is it four) month period from December '92 to March
>> >'93, gopher usage increased by 65%. If you think that sounds
>> >impressive, wait -- WWW usage increased by 2800%. Of course, the
>> >relative operational ages of the systems should be taken into account,
>> >but it still sounds pretty darn impressive :)
>> Note one thing however -- I looked at my usage stats the other day and
>> found they had increased tremendously. However, close inspection of
>> the log file showed that around April 3-6 a mail robot at
>> had been active, multiplying my load from an
>> average twenty to over a thousand requests per day...
>OK, hypothetical question time.
>Suppose we had developed a new net scanning tool that consisted of a
>daemon that can run independently of user guidance and a Motif-based
>GUI that can attach/detach to/from the daemon at any time and allow a
>user to interactively or asynchronously guide the daemon's progress
>through the information space on the global network.
>Suppose further, and more specifically, that this front-end was really
>really impressive (also suppose I didn't write it, so I could
>theoretically say that without sounding pompous) and included
>interactive drag/drop/point/click graphical maps of the information
>space the daemon traversed, as well as direct control over a variety
>of constraint parameters to control the daemon's progress both in real
>time and asynchronously (e.g., you can tell it where to look and what
>to look for interactively, and then detach and let it run
>asynchronously overnight, and then reattach in the morning to discover
>what it's found).
>If we had theoretically developed such a hypothetical tool, should we
>release it to the net? Or are experiences similar to Guido's above,
>except on a much larger scale, liable to have severely deleterious
>effects on the network and the servers currently on it, and should we
>therefore not release it?

Woooaaah ... this could be a problem. :) The stats on our server went
from about 500 requests a day to over 6000 in a 5-hour stretch while your
bot was roaming (it didn't help that it seemed to jump into the man page
link). This could get ridiculous if people at 20 sites decided to let this
bot loose at once.
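That jump (500 a day to 6000 in five hours) is really a per-host rate
problem: however fast the bot runs overall, it should space out its hits on
any one server. A minimal sketch of the kind of throttle I mean, in Python
for illustration -- all the names here are hypothetical, nothing comes from
the actual daemon:

```python
import time
from urllib.parse import urlparse

class HostThrottle:
    """Enforce a minimum delay between requests to the same host.

    A bot that averages at most one request per `min_delay` seconds
    per host caps its load on any single server, no matter how many
    hosts it is crawling in total.
    """

    def __init__(self, min_delay=30.0):
        self.min_delay = min_delay   # seconds between hits on one host
        self.last_hit = {}           # host -> time of its last request

    def wait_time(self, url, now=None):
        """Seconds the bot should still wait before fetching `url`."""
        now = time.monotonic() if now is None else now
        host = urlparse(url).hostname or ""
        last = self.last_hit.get(host)
        if last is None:
            return 0.0               # never hit this host: go ahead
        return max(0.0, self.min_delay - (now - last))

    def record(self, url, now=None):
        """Note that `url`'s host was just fetched."""
        now = time.monotonic() if now is None else now
        host = urlparse(url).hostname or ""
        self.last_hit[host] = now
```

The crawl loop would call `wait_time()` before each fetch, sleep if it is
nonzero, and `record()` afterwards -- so even a bot left running overnight
stays down around a couple of requests per minute per server.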

How about:

1. Have 2 or 3 sites that will run the bot - presumably major WWW
sites (cern, ncsa, ??). They could run their bots once a month,
or whenever a new site opens, they could aim it there and let
it rip.
2. Extend the Motif interface so that it can connect to a remote
machine (if it can't already). Also tone it down a bit, so
people not on the machine where it's running can't point it at
new areas or control its movement. Have the main sites put up
the data they collect on some sort of toned-down server, and let
people look at the graphical representation of 'WEB-Space'(tm).
3. Release the toned-down Motif interface to everyone.
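For point 2, the split between "people on the machine where it's running"
and everyone else could be as simple as checking where a control connection
comes from: local connections get the full command set, remote ones only
the read-only viewing commands. A rough sketch in Python -- the command
names and line-oriented protocol are made up for illustration, not taken
from the hypothetical daemon:

```python
# Commands anyone may issue: just looking at collected data.
READ_ONLY_CMDS = {"stats", "map"}
# Commands that steer the bot: reserved for local users.
CONTROL_CMDS = READ_ONLY_CMDS | {"aim", "pause", "resume"}

def allowed_commands(peer_addr):
    """Remote viewers may only look; local users may steer the bot."""
    host, _port = peer_addr
    if host in ("127.0.0.1", "::1"):   # connection from this machine
        return CONTROL_CMDS
    return READ_ONLY_CMDS

def handle_line(line, peer_addr):
    """Accept or refuse one command line from a connected client."""
    parts = line.strip().split()
    cmd = parts[0] if parts else ""
    if cmd in allowed_commands(peer_addr):
        return "OK " + cmd             # dispatch to the daemon here
    return "DENIED " + cmd
```

So a remote user could pull down the 'WEB-Space' map all day long, but an
`aim` command would only be honored from the console of one of the 2 or 3
blessed sites.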

Would this be possible with the hypothetical server/interface as they
stand right now? Or is this a dumb idea?

Also - how would (or does) this display the 'map' of the information it
finds? Just curious.

>You be the judge.........

-- William M. Perry ( --
He was a cowboy, mister, and he loved the land. He loved it so much he
made a woman out of dirt and married her. But when he kissed her, she
disintegrated. Later, at the funeral, when the preacher said, "Dust to
dust," some people laughed, and the cowboy shot them. At his hanging,
he told the others, "I'll be waiting for you in heaven--with a gun."