Getting Together Bit by Bit

High-Speed Digital Science Networks

Joseph Palca
Science

April 13, 1990

LAST MONTH, SCIENTISTS got a peek at the future. On a color monitor set up in a hotel room in Washington, a computer-generated image of an aluminum-gallium-arsenide crystal rotated slowly, turning in response to commands from an operator seated next to the monitor. What made the demonstration remarkable was that the computer generating the image was located in Ann Arbor, Michigan. The graphic data was traveling along a prototype implementation of NSFNET--the national science network--running nearly 30 times faster than the network's current top speed.

The slowly rotating picture may be the harbinger of a revolution in the way all scientists, not just computer engineers and physicists, will conduct their research. High-speed electronic networks will bring new techniques into routine use in research. Among them: transporting the power of supercomputer graphics directly to the lab bench and providing instantaneous access to the huge databases that will be amassed in the human genome project or studies of global change. And that will change science in a variety of ways, perhaps the most dramatic being to remove physical barriers and permit collaborations on a scale that has never before been possible.

And those are just the obvious implications. "We're going to be surprised," insists Robert Kahn of the Corporation for National Research Initiatives, one of the first to develop a high-speed digital network, "because a lot of the real utility of high-speed nets will be serendipitous."

Part of what is propelling the changes is the simple notion that pictures, especially moving pictures, are more eloquent than words. High-speed transmission of images is about to do what the telephone has done for voice communication and what the fax is starting to do for text and graphics. "Most people's comprehension works around visualization," says Paul G. Huray, of the White House Office of Science and Technology Policy, and networks will make that comprehension "tremendously more transferable."

The principles of this transfer aren't new, but the capacity to transfer enough information quickly enough to make a real difference has only just arrived. The idea for a computer network was born in the 1960s at the Rand Corporation, and realized late in that decade in ARPANET, a network supported by the Defense Advanced Research Projects Agency (DARPA). Despite the Pentagon's financial support, ARPANET was a fairly open environment: pretty much anyone who wanted to get on the network could do so. "The notion when ARPANET was established was that it was primarily to share computing resources," says Douglas E. Van Houweling, vice provost for information technology at the University of Michigan. "As things turned out, that wasn't the way it got used. It got used by human beings who wanted to work with other human beings."

In the 1970s a whole variety of networks joined ARPANET to offer connectivity. Some were regional--like Merit in Michigan and BARRNET in the San Francisco Bay Area--linking government and academic institutions together. Others were national, like BITNET and CSNET, which offered nationwide services to academic institutions as well as connections to other countries. Federal agencies like the National Aeronautics and Space Administration and the Department of Energy set up their own networks.

Missing in all of this was the sense of a national framework. Then, in the early 1980s, the National Science Foundation entered the scene with NSFNET. Designed around the six NSF-supported supercomputer centers, NSFNET was intended to form a high-speed backbone for a national network, linking together not only the supercomputer installations but also the regional networks that had sprung up around the country. In 1987, NSF awarded a 5-year contract to a consortium of IBM, MCI Communications Corporation, and the Merit Computer Network to operate and upgrade the NSFNET backbone. When the Merit consortium took over, the backbone operated at 56 kilobits per second. Today those lines have been upgraded to 1.5 million bits per second, and later this year portions of the backbone will be improved to 45 million bits per second.
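For scale: each upgrade is roughly a thirtyfold jump, which is where the "nearly 30 times faster" of the opening demonstration comes from. A quick check, assuming the standard carrier rate behind the article's rounded figure (a T1 circuit runs at 1.544 megabits per second):

    original = 56e3   # 56 kilobits/s, the backbone Merit inherited
    t1 = 1.544e6      # the article's "1.5 million bits per second"
    t3 = 45e6         # the coming "45 million bits per second" lines

    print(t1 / original)  # ~27.6 -- the first upgrade
    print(t3 / t1)        # ~29.1 -- the jump shown in Washington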

The appetite of researchers for these facilities has proved voracious. Network traffic is most easily measured in packets, discrete packages of bits that contain address information and some fraction of the particular message being sent. In 1988, just over 100 million packets per month were traveling across the network. In February of this year, the network carried 2.5 billion packets. And the numbers have been growing by an average of 20% per month. "In the last month, 10% of all the information that has ever been sent was sent," says Huray. "That's an incredible statement."
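Those figures hang together on a back-of-the-envelope check. A minimal sketch in Python, using only the numbers quoted above (the exact share in the last line depends on when you start counting, so it lands near, not exactly on, Huray's 10%):

    import math

    # Figures from the article: roughly 100 million packets/month in
    # 1988, 2.5 billion in February 1990, growth averaging 20%/month.
    start, latest, growth = 100e6, 2.5e9, 0.20

    # Months of 20% compounding needed to get from 100 million to
    # 2.5 billion packets.
    months = math.log(latest / start) / math.log(1 + growth)
    print(f"{months:.1f} months")  # ~17.7 -- about a year and a half

    # Under steady exponential growth, the latest month's share of all
    # traffic ever carried approaches a constant, g / (1 + g).
    print(f"{growth / (1 + growth):.0%} of cumulative traffic")  # ~17%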

And the use of the network isn't limited to a select group; everyone is getting into the act. "Some people believe that a fairly few scientists pumping data to and from the supercomputer centers represent the totality of traffic," says Eric M. Aupperle, president of the Merit Computer Network. "That is simply not the case." Instead, Aupperle says the traffic results from "literally thousands of users, and it is relatively infrequent that any one individual comes close to monopolizing the resource."

And these thousands of users are doing many different things with the network. David F. Nitz, a physicist at the University of Michigan, used to wait for weeks or months to get data from an ultrahigh-energy gamma-ray detector in Utah. Now the data comes in within a day over the network. Robert Beekman, a pediatric cardiologist at the University of Michigan, uses the network to access medical data files stored in Pittsburgh when seeing patients in a satellite clinic 2-1/2 hours from his office in Ann Arbor. And literally thousands of scientists start their day in the United States, and increasingly around the world, by reading mail sent by their colleagues. Even for relatively low-speed networks, people get hooked on electronic mail because, as Michael M. Connors, director of Computing Systems at IBM, says, "it beats the socks off the U.S. Post Office."

Getting an exact fix on who is using the network is difficult because of the way network cultures have evolved: it is against the unwritten rules of network protocol to see what is inside an individual packet. But just as the post office can tell where a letter is going without knowing what's inside, network operators can sort packets into general categories by the information in their address fields. Approximately 30% carry electronic mail from one user to another, and about the same share carry file transfers between users. Another 20% fall under the heading of interactive applications, such as using the network to work on a computer at a different institution. And 15% are taken up by directory inquiries.
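The article doesn't spell out how operators do that sorting, but on a TCP/IP network like NSFNET the natural handle is the destination port in each packet's header, which names the service without exposing the payload. A minimal sketch, assuming the standard port assignments of the era (the sample packet list is hypothetical):

    # Group packets into service categories by destination port,
    # without ever inspecting what is inside them.
    CATEGORIES = {
        25: "electronic mail",      # SMTP
        20: "file transfer",        # FTP data channel
        21: "file transfer",        # FTP control channel
        23: "interactive session",  # Telnet remote login
        53: "directory inquiry",    # name-service lookup
    }

    packets = [25, 23, 21, 53, 25, 20, 79]  # destination ports only
    tally: dict[str, int] = {}
    for port in packets:
        label = CATEGORIES.get(port, "other")
        tally[label] = tally.get(label, 0) + 1
    print(tally)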

But the future hasn't yet arrived. Even with the phenomenal recent growth of the NSFNET, we are only now on the verge of a truly national system with the kind of power demonstrated in that Washington hotel room. Both the executive branch and Congress are grappling with how to move ahead with the next generation of a nationwide network. At the request of Congress, the Committee on Computer Research and Applications, under the OSTP Federal Coordinating Council for Science, Engineering and Technology, produced a national strategy for high-performance computing. That report recommended the formation of a high-speed computer network linking government, industry, and universities. A second report, released last fall, outlines specific goals and budgets for what is now dubbed the National Research and Education Network. NREN-1 is essentially the existing NSFNET operating at 1.5-megabit speeds. NREN-2 will be the new network when it is upgraded to 45-megabit speeds. By the end of this decade, NREN-3 will emerge, with data traveling at gigabit rates: billions of bits per second.

And it is at that gigabit level that the quantum jump into a new kind of connectivity will probably take place. Much effort is now going into exploring the implications of that leap. A leader in that effort is Robert Kahn. With teams of researchers drawn from industry and universities, and $15 million in support from NSF and DARPA, Kahn has established five testbeds to develop the technology needed to transmit and route gigabits of data per second.

And as Kahn and his colleagues muse on what the network of 2000 will look like, others are thinking about what will be done with it. "We're trying to determine what applications really require gigabit bandwidth end to end," says Vinton C. Cerf, another network pioneer working at the Corporation for National Research Initiatives. The most likely application for the broad-bandwidth connections is animated images. An image made up of a 1,000 by 1,000 array of points (pixels), where each pixel is made of 24 bits of information updated 30 times per second to give smooth motion, requires (excluding clever data-compression algorithms) 720 megabits per second--and a couple of 720-megabit-per-second projects use up even a gigabit network in a hurry.
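The arithmetic is worth checking, since it drives the whole case for gigabit links. A quick verification using only the figures in the paragraph above:

    pixels = 1000 * 1000        # 1,000 by 1,000 image
    bits_per_pixel = 24         # 24-bit color
    frames_per_second = 30      # smooth motion

    rate = pixels * bits_per_pixel * frames_per_second
    print(rate)                 # 720,000,000 bits/s: 720 megabits/s

    # Two such uncompressed streams already overrun a gigabit link.
    print(2 * rate > 1e9)       # True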

Another project being considered is linking multiple supercomputers together to do calculations in tandem. Kahn points out that you can obtain a more than twofold speedup in computation by joining two special-purpose computers--say, one with a parallel architecture and the other serial--to work on a single problem. The trick, says Cerf, is to pass the data back and forth between the two machines rapidly enough so they can both work as fast as possible. "As you start looking at it, the bandwidth requirements become quite horrendous," he says.
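Kahn's more-than-twofold claim is easy to see with made-up numbers: when each machine is fast on one part of a problem and slow on the other, splitting the work lets each do only what it is built for. A sketch with purely illustrative timings (the transfer time it ignores is exactly the bandwidth problem Cerf describes):

    # Hypothetical stage times, in seconds, for a job with a highly
    # parallel part and an inherently serial part.
    parallel_machine = {"parallel part": 10, "serial part": 100}
    serial_machine = {"parallel part": 100, "serial part": 10}

    # Either machine alone must grind through its weak part.
    alone = min(sum(parallel_machine.values()),
                sum(serial_machine.values()))

    # Split the job so each machine takes the part it is built for.
    # This ignores the cost of shipping intermediate data between
    # them -- the "horrendous" bandwidth requirement.
    split = parallel_machine["parallel part"] + serial_machine["serial part"]

    print(alone / split)  # 5.5 -- well more than a twofold speedup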

The technical problems will be intimidating--but they'll be solved. The networks will continue to grow by leaps and bounds. But when will all this really begin to change the way science is done? Perhaps only when the networks become more integrated with the culture. MCI Communications vice president Richard Liebhaber says his company believes the key to bringing networks into everyday use is to make them easy to access. "We should be able to deal with images and data the same way we deal with voice," he says. "With the telephone, I just dial the seven-digit number and I'm talking to you." What made facsimile machines popular is that they were just about as easy to use. But facsimile machines transmit static images, and there is limited potential for real-time interaction with the image. "All you've done is accelerate the 17th-century technology," he says. When networks become as easy to use as the telephone, they should have a much bigger impact.

Even now, however, before the network has become as easy to use as the telephone, there are indications that it can change the way a researcher operates--if he is persistent and warms to the technology. Take the example of computer science professor Allen Newell of Carnegie Mellon University in Pittsburgh. Van Houweling, who was at Carnegie Mellon before he moved to Michigan, says, "Frankly, [Newell] lives on the network. His intellectual and personal life is executed more over the network than face-to-face communications." Newell agrees that the network is fundamental to his research efforts. "I exist as a member of an extended research community," he says. Newell and his colleagues, who are spread among several institutions, are working on an artificial intelligence project called SOAR. The collaborators use the network for everything from discussing conceptual issues to exchanging software and revising publications. "I cannot imagine that we can run the project without it," he says.

But even the nature of electronic mail is undergoing a change. "One of the exciting things that is happening right now is the whole multimedia explosion. An interaction with computing technology is beginning to take place," says Van Houweling. As a result of that cross-fertilization, mail will no longer be just plain text, but will include data and graphics--even voice. The NeXT computer developed by Steven P. Jobs, for example, can digitize a voice signal that can be added to a mail message and translated back into voice by the recipient.
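The mechanics are simple in principle: digitized sound is just bytes, and bytes can be encoded into the plain text that the mail systems of the day carry. A conceptual sketch of that round trip (the base64 step stands in for uuencode-style text encoding; the voice samples here are fabricated):

    import base64

    # Pretend these bytes are a digitized voice snippet.
    voice_samples = bytes(range(256))

    # Encode the binary audio as printable text so it can ride inside
    # an ordinary text mail message.
    encoded = base64.b64encode(voice_samples).decode("ascii")
    message = "Subject: project update\n\n[voice annotation]\n" + encoded

    # The recipient's machine peels the text back into audio bytes.
    received = message.split("\n")[-1]
    assert base64.b64decode(received) == voice_samples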

On another level, the advent of networks will make it possible for scientists anywhere to gain remote access to unique pieces of machinery. The Berkeley-Illinois-Maryland Millimeter Array telescope, for example, is located in Hat Creek, California, but the scientists who run it aren't there: they're in College Park, Maryland--and they run the instrument via the network. It's likely that as more facilities are designed to be operated this way, research workers will be able to spend more time on research and less time waiting in airports.

On a grander level still, the network should make collaborations of a qualitatively new sort possible. At a workshop last year at Rockefeller University, a group of computer scientists, biologists, physicists, and engineers got together to discuss how new networking and computing capabilities could be used to forge new types of collaboration. One area that they concluded will benefit from the new technology is the U.S. Global Change Research Program. Supercomputer models, earth- and satellite-based sensors, databases, and scientists will all need to send input to, and get output from, one another. There is no communication system up to the task today, but large networks should go a long way toward providing the glue that will bind the project together.

The network will also permit international collaborations to flourish. In Israel, a country with chronically scarce resources for research but thousands of top-rank scientists, researchers aggressively use network links to stay plugged into research abroad. David Mirelman, a biophysics professor at the Weizmann Institute in Rehovot, maintains an active collaboration with Louis Diamond at the National Institute of Allergy and Infectious Diseases on the parasite Entamoeba histolytica--using the BITNET network.

These are all positive influences of the network on the culture of science, but there's also a dark side of the network culture--one that's already appeared. As more and more people gain access to the networks (anyone with "a piece of wet string or a carrier pigeon" can get connected now, says Geoffrey Goodfellow of Anterior Technology), it becomes practically certain that the network will be abused--by sophisticated hackers like Robert Morris or computer criminals intent on theft or even international terrorism.

"That's why this whole notion of hackers as guys who screw around in the network for the hell of it or to prove their manhood is really so goddam dangerous," says MCI's Liebhaber. The hacker isn't a romantic figure, according to Liebhaber, he's like someone driving on the wrong side of the road. "We really need to develop a set of cultural and behavioral protocols for using the networks."

Huray imagines an even more destructive scenario. What would the effect be, he muses, if terrorists got into a country's treasury database and threatened to change all of the sixes into nines? What ransom would a country be willing to pay to avoid being plunged into economic chaos? Huray believes the day will come when entire academic departments are devoted to the offensive and defensive strategies of computer warfare.

All this may still seem a bit farfetched as the 1990s begin. Yet whatever the ultimate consequences, there is general agreement that computer networks have become a technological juggernaut. The choice for the individual scientist is to grab on--or get run over. "The technology is moving so fast that it is really hard to tell what kind of cultural changes will occur," says Huray. "All you can be sure of is something big is going to happen."


COPYRIGHT 1990 American Association for the Advancement of Science