Unlike the Greek goddess Athena, the Internet did not spring from the head of some Zeusian computer scientist. It was formed by a process of relatively rapid accretion and fusion (but keep in mind that this industry is one in which computer power doubles every few years). In 1980, there were 200 machines on the Internet -- that number is now more than 3.9 million. The grain of sand that formed the heart of this giant electronic pearl came from the U.S. Department of Defense (DoD) in 1969. I'm pleased to be older than the Internet, having been born in 1967, but I'm not enough older to talk authoritatively about world conditions at that time. So, please bear with my secondhand retelling.
In the 1950s, the Russian Sputnik program humiliated the United States. To better compete in the space race, the U.S. space program (at the time under the auspices of the military) received major government funding. That funding came from the DoD under its Advanced Research Projects Agency (ARPA). In the early 1960s, the space program left the military to become NASA, but ARPA remained, and as with many government programs that have seemingly lost their reason to exist, so did its funding. What to do with the money?
The DoD was at that time the world's largest user of computers, so J.C.R. Licklider and others proposed that ARPA support large-scale basic research in computer science. ARPA didn't originally require that the research it supported be either classified or directly related to military applications, which left the door open for far-reaching research in many fields. In 1963, ARPA devoted a measly $5 to $8 million to its computer research arm, the Information Processing Techniques Office (IPTO), first under Licklider and subsequently under the 26-year-old Ivan Sutherland, who had developed an early (perhaps the earliest) graphics program at MIT. After Sutherland, a 32-year-old named Robert Taylor headed IPTO. Taylor managed to double IPTO's budget in a time when ARPA's overall budget was decreasing, and even admitted to diverting funds from military-specific projects to pure computer science.
Around this time, the ARPAnet (Advanced Research Projects Agency Network) got its start, connecting various computers around the country at sites performing research for ARPA. Computers were expensive, and sharing them was the only way to distribute the resources appropriately. Distribution of cost via networks proved to be an important force in the development of the Internet later on as well. Proponents such as Taylor ensured the early survival of the fledgling ARPAnet when it was all too vulnerable to governmental whimsy.
In 1969, Congress got wind of what ARPA was up to in terms of funding basic research with money from the defense budget. Three senators, including the still-active Edward Kennedy, pushed through legislation requiring that ARPA show that its programs were directly applicable to the military. In the process, ARPA's name changed to reflect its new nature; it became the Defense Advanced Research Projects Agency, or DARPA. (Years later, the name changed back to ARPA again, just to confuse the issue.) Bob Taylor became entangled in some unpleasant business reworking military computers in Saigon during the Vietnam War and left DARPA shortly thereafter. He was succeeded by Larry Roberts, whose work focused largely on getting the then two-year-old ARPAnet up and running. Stewart Brand, founder of The Whole Earth Catalog, wrote at the time:
At present some 20 major computer centers are linked on the two-year-old ARPA Net. Traffic on the Net has been very slow, due to delays and difficulties of translation between different computers and divergent projects. Use has recently begun to increase as researchers travel from center to center and want to keep in touch with home base, and as more tantalizing sharable resources come available. How Net usage will evolve is uncertain. There's a curious mix of theoretical fascination and operational resistance around the scheme. The resistance may have something to do with reluctance about equipping a future Big Brother and his Central Computer. The fascination resides in the thorough rightness of computers as communication instruments, which implies some revolutions. (Stewart Brand, in II Cybernetic Frontiers, Random House, 1974)
So if DARPA had to justify the military applications of its research, what survived? Well, the ARPAnet did, and here's why: As leaders of the free world (pardon the rhetoric), we needed the latest and greatest methods of killing as many other people as possible. Along with offensive research must perforce come defensive research; even the DoD isn't so foolish as to assume we could wage a major war entirely on foreign soil. For this reason, the tremendous U.S. interstate highway system served double duty as distribution medium for tanks and other military hardware. Similarly, the Internet's precursor was both a utilitarian and experimental network. ARPAnet connected military research sites (hardware was expensive and had to be shared) and was also an experiment in resilient networks that could withstand a catastrophe -- including, in the imaginations of the DoD planners of the day, an atomic bomb.
Interestingly, the resiliency of the ARPAnet design, as carried down to the Internet, has led some to note that the Internet routes around censorship as it would route around physical damage. It's a fascinating thought, especially in regard to Stewart Brand's earlier comment about Big Brother. If anything, the Internet actually has served to reduce the threat of a Big Brother, because it makes communication between people so fluid and unrestricted. But, I anticipate myself.
As a result of the machinations described previously, the Internet Protocol, or IP (the second half of TCP/IP), was created. Essentially, the point behind IP is that each computer knows of, or can determine, the existence of all the others and can thus route packets of information to their destinations via the quickest route. While doing so, the computers can take into account any section of the network that's been bombed out or has merely been cut by an overenthusiastic telephone repairperson. This design turns out to work well; more importantly, it makes for an extremely flexible network. If your computer can get a properly addressed packet of information to a machine on the Internet, that machine will worry about how to deliver it, translating as necessary. That's the essence of a gateway -- it connects two dissimilar networks, translating information so that it can pass transparently from one to the other.
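The routing idea behind this resilience can be sketched in a few lines of Python. This is not how IP actually works -- real routers exchange routing tables rather than computing entire paths up front, and the node names and link costs below are invented purely for illustration -- but it shows how a network can find the quickest route and automatically route around a damaged node:

```python
from heapq import heappush, heappop

def shortest_route(links, start, dest, down=()):
    """Find the cheapest path from start to dest, ignoring any
    nodes listed in `down` (e.g., a site that's been knocked out)."""
    down = set(down)
    # Priority queue of (cost so far, node, path taken to reach it).
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heappop(queue)
        if node == dest:
            return cost, path
        if node in seen or node in down:
            continue
        seen.add(node)
        for neighbor, hop_cost in links.get(node, []):
            if neighbor not in seen:
                heappush(queue, (cost + hop_cost, neighbor, path + [neighbor]))
    return None  # No surviving route at all.

# A toy network: each node lists (neighbor, cost) pairs.
net = {
    "ucla": [("sri", 1), ("utah", 4)],
    "sri":  [("ucla", 1), ("utah", 1), ("mit", 5)],
    "utah": [("ucla", 4), ("sri", 1), ("mit", 1)],
    "mit":  [("sri", 5), ("utah", 1)],
}

print(shortest_route(net, "ucla", "mit"))                # cheapest path via sri and utah
print(shortest_route(net, "ucla", "mit", down=["sri"]))  # routes around the downed node
```

Knocking out the "sri" node doesn't stop delivery; the search simply settles for the next-cheapest surviving path, which is exactly the behavior the ARPAnet's designers were after.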
In the early 1980s, the military began to rely more and more heavily on the ARPAnet for communication, but because the ARPAnet still connected a haphazard mix of research institutions, businesses doing defense work, and military sites, the military wanted its own network. And so the ARPAnet split in half, becoming the ARPAnet and the Milnet (Military Network). The ARPAnet continued to carry traffic for research sites, and even though the military now had its own Milnet, traffic passed between the ARPAnet and the Milnet by going through gateways.
The concept of gateways proved important in the history of the Internet. Alongside the Internet came a number of other, generally smaller, networks that used protocols other than IP, such as BITNET and JANET, as well as some, like Usenet and CSNET, that didn't care what protocols were used. These networks were regional or dedicated to serving certain types of machines or users.
Perhaps the largest driving force behind the Internet has been the need to connect with other people and other networks. The grass is always greener on the other side of the fence, and gradually gateway sites sprang up so that email could pass between the different networks with ease.
I'm going to take a brief break from the Internet itself, because at approximately the same time the ARPAnet split, a whole host of other networks came into being, probably the most interesting of which was Usenet, the User's Network.
Usenet started in 1979, when two graduate students at Duke decided to link several Unix computers together in an attempt to better communicate with the rest of the Unix community. The system they created included software to read news, post news, and transport news between machines. To this day, that simple model continues, but whereas once two machines were on Usenet, today there are hundreds of thousands. The software that transports and displays Usenet news now runs on not just Unix machines, but on almost every type of computer in use on the networks. The topics of discussion have blossomed from Unix into almost any conceivable subject -- and many inconceivable ones. Like all the other network entities, Usenet quickly grew to be international in scope and size.
Unlike many of the other networks, Usenet truly grew from the bottom up, rather than from the top down. Usenet was created by and for users, and no organization -- commercial, federal, or otherwise -- had a hand in it originally. In many ways, Usenet has provided much of the attitude of sharing that exists on the Internet today. In the past, you usually got a Usenet feed (that is, had another machine send news traffic to your machine) for free (other than your telephone charges) as long as you were willing to pass the feed on to someone else for free. Due to commercial pressures, the days of the free feeds are essentially no more, but the spirit of cooperation they engendered remains in much of what happens on the Internet.
I don't want to imply that Usenet is this happy carefree network where everything is free and easy, because in many cases it's a noisy, unpleasant network that exists because of the utility of some of the information that it carries. Despite the attitude toward sharing, the survival of Usenet is due in large part to the resourcefulness of network administrators at major sites. Faced with mounting telephone charges for long distance calls between Usenet hosts, these administrators found a way to carry Usenet news over the TCP/IP-based Internet rather than just the previous modem-based UUCP connections. Thus, they prevented the costs of carrying Usenet from coming to the attention of the bean counters poised to strike unnecessary expenses from their budgets. The TCP/IP connections of the ARPAnet, and then the Internet, were already paid for. So, by figuring out how to carry Usenet over those lines, the network administrators managed to cut their costs, keep users happy, and save Usenet from itself in the process. In other words, Usenet may be an anarchy, but it wouldn't stand a chance without some occasional help from high places.
Shortly after Usenet took its first faltering networked steps, Ira Fuchs of City University of New York and Greydon Freeman of Yale University decided to network their universities using IBM's then-new NJE communications protocol. Although this protocol later expanded to support Digital Equipment's Vaxen running VMS and even some implementations of Unix, the vast majority of machines on BITNET (the "Because It's Time" network) have always been IBM mainframes. Fuchs and Freeman made their connection in the spring of 1981. BITNET grew rapidly, encompassing over 100 organizations on 225 machines by 1984, and reaching, in 1994, the level of 1,400 organizations in 49 countries around the world. Most BITNET sites are at universities, colleges, and other research institutions.
BITNET has always been a cooperative network; members pass traffic bound for other sites for free, and software developed by one has been made available to all. Unlike Usenet, however, BITNET developed an organizational structure in 1984. This took the form of an Executive Committee, made up of representatives of all the major nodes on the network. Also in 1984, IBM provided a large grant that provided initial funding for centralized network support services. This grant, coupled with the fact that most of the machines on BITNET were IBM mainframes, gave rise to the erroneous rumor that BITNET was an IBM network. In 1987, BITNET became a nonprofit corporation. In 1989, it changed its corporate name to CREN, the Corporation for Research and Educational Networking, when it merged its administrative organization with another of the parallel educational networks, CSNET (the Computer+Science Network). Today, BITNET is in something of a decline, due in large part to the nonstandard NJE protocol in an increasingly IP world.
The next big event in the history of the Internet was the creation of the high-speed NSFNET (National Science Foundation Network) in 1986. NSFNET was developed to connect supercomputer sites around the country. Because supercomputers are terribly expensive, the NSF could afford to fund only five (and even then they received some major financial help from companies like IBM). With this limited number, it made sense to network the supercomputers so that researchers everywhere could use them without traveling great distances. At first, the NSF tried to use the ARPAnet, but that attempt quickly became bogged down in bureaucracy and red tape.
The NSF therefore decided to build its own network. Merely connecting the five supercomputer sites wasn't going to help the vast majority of researchers, of course, so the NSF created (or used existing) regional networks that connected schools and research sites in the same area. Then those networks were connected to the NSFNET.
To quote from W.P. Kinsella's Shoeless Joe, "If you build it, he will come." Perhaps not surprisingly, once all of these networks were able to communicate with one another, the supercomputer usage faded into the background. Other uses, most notably email, became preeminent. One of the important features of the NSFNET was that the NSF encouraged universities to provide wide access to students and staff, so the population of and traffic on the net increased dramatically.
In 1987, the NSF awarded a contract to a group of companies to manage and upgrade the NSFNET. This group was made up of IBM, MCI, and Merit Network, which ran the educational network in Michigan. The group dealt with the massive increase in traffic by replacing the old lines with much faster connections.
Eventually the NSFNET had entirely supplanted the ARPAnet, and in March of 1990, the ARPAnet was taken down for good, having played the starring role for 21 years. Similarly, another national network, CSNET, which had connected computer science researchers around the country, closed its electronic doors a year later, all of its traffic having moved to the faster NSFNET.
The NSFNET is all fine and nice, but in many ways it discriminated against "lower" education -- two-year colleges, community colleges, and the much-maligned K-12 schools. To save the day, then-Senator Al Gore sponsored a bill, passed in December of 1991, called the "High-Performance Computing Act of 1991." Gore's legislation created a new network on top of (and initially using) the NSFNET. This new network is called the interim NREN, for National Research and Education Network. Along with providing even faster speeds when feasible (at which point the "interim" will go away), the NREN specifically targets grade schools, high schools, public libraries, and two- and four-year colleges. In working with the thousands of people who subscribe to TidBITS, I see a lot of email addresses, and it's clear to me that these educational institutions are joining the Internet in droves. A day rarely passes when I don't see something from someone whose address clearly labels him or her as a teacher at a grade school or even a student in a high school.
Alert readers probably have noticed that NREN looks a lot like CREN, and in fact, the acronyms are similar -- with reason. CREN recognizes the need for an integrated National Research and Education Network. In fact, as the IBM-created NJE protocol gradually disappears in favor of the more powerful and popular IP, CREN has said it will disband, merge with NREN, or cooperate with it as appropriate -- though only when NREN exists with access rules, funding, and usage policies that allow a clean transition. Currently, CREN feels that the interim NREN, the NSFNET, does not provide consistent policies regarding these issues. And, of course, what happens if commercial organizations end up running some large part of the NREN?
Along with the NREN taking over the part of the Internet that was the NSFNET, more and more of the Internet is being created and run by commercial organizations. All a commercial provider has to do is to pay for its part of the network, just as universities pay for their connections and government departments pay for theirs. The difference is that unlike universities or government organizations, commercial providers want to make money, or at least break even, so they in turn sell access to their machines or networks to other providers or to end users.
The gut reaction to the commercialization of the Internet from the old-timers (who remember when you could get a Usenet feed merely by asking) is often negative, but most people believe that the Internet must accept commercial traffic. This acceptance comes in part from the fact that the only alternative to accepting commercial traffic is actively rejecting it, and no one wants to sit around censoring the Internet, were that even possible.
Commercialization also allows small organizations to create the equivalent of wide area networks that previously only large businesses could afford. A company such as Microsoft can spend the money to install an international company network, but few companies are so large or so wealthy. Many may not need such an international network, but may need enhanced communications. Email can be a powerful medium for business communication, just as it is for personal communication. And, if transferring a file via FTP or email can save a few uses of an overnight courier, the connection can pay for itself in no time.
In addition, whereas in the past you had to work at a large business or university to gain Internet access, it has become far easier for an individual to get access without any such affiliation, although the costs are of course more obvious. Easier independent access couldn't have happened without increased participation by commercial interests.
The commercialization issue has another side. The U.S. government still runs the interim NREN, which is a large portion of the Internet and connects many of the major educational sites. As more commercial providers get into the business and see the massive interest in the Internet, they increasingly think that the government should turn the public portions of the Internet over to them. This thought has much support because the commercial providers could make money, which is what they want to do, and the government could save money, which is what many people want the government to do.
In fact, in the summer of 1993, an impassioned plea was zapping around the Internet. This plea, poorly worded and ambiguous, claimed that the government was indeed proposing to sell off the Internet -- lock, stock, and barrel -- which, the message claimed, might result in millions of people losing free Internet access. Coincidentally, as I wrote the second edition of Internet Starter Kit for Macintosh in the spring of 1994, another such message had just appeared, although this time from the Taxpayer Assets Project (TAP), a nonprofit government watchdog organization. The TAP letter claimed that the National Science Foundation was proposing to contract with four telephone companies to provide the high-speed Internet backbone, and -- the claim continued -- that usage-based pricing would appear on the Internet as a result, harming the Internet in the process. In an informal rebuttal posted to a Cornell mailing list, M. Stuart Lynn, then the head of Cornell Information Technologies, noted that the Internet is a global network and some countries, such as New Zealand, already have usage-based pricing. So even if the NSFNET moved to usage-based pricing, most of the Internet wouldn't be affected. Stuart Lynn also commented that the federal subsidy is trivial to many institutions, and at Cornell is equivalent to two cans of beer per student per year. In other words, even if Cornell had to rely on a completely commercial network (which might or might not be usage-based), its costs would not change noticeably.
NOTE: It's worth noting that people like flat-rate fees for most things (telephone service and cable service come to mind), and most personal Internet accounts from commercial providers have been usage-based, with only a recent trend toward flat-rate service in the past year. I believe the increasing number of flat-rate SLIP and PPP accounts from various commercial providers was helped in part by the first edition of this book, with its flat-rate offer for SLIP access from Northwest Nexus. I'm unaware of any other widely available flat-rate accounts that predate the offer from Northwest Nexus. Of course, I could be wrong, but I like to think I had a positive influence.
Such dire warnings of impending Internet doom -- some real, most not -- appear every few months. It's difficult to determine which you should act on. My advice is: do nothing until you have sufficient facts to cause you to believe that the danger is indeed real. Regarding the government sellout scare, no contact information or pointers to current legislation exist, which makes it hard to believe without more corroboration. The TAP claim seemed more serious until I saw Stuart Lynn's message discussing how this wouldn't affect Cornell (and presumably, many other institutions) at all.
The trick is to remember that someone always pays for the Internet. If you have a free Internet account thanks to your school, remember that the institution is paying for that connection and funding it in part from your tuition. If your workplace offers Internet access and doesn't limit your use of it, consider that a benefit of working there, along with retirement and health benefits. And an increasingly large number of people, like me, pay directly, usually somewhere between $5 and $30 per month. Sure beats cable television.
Remember how I previously said that the NSFNET was created to carry supercomputer traffic but soon found itself being used for all sorts of tasks? That's another basic principle to keep in mind about how the Internet is funded. The network links were created for a specific reason (supercomputer access), and because of that reason, the money necessary to create and maintain those links was allocated in various budgets. Thus, when traffic unrelated to the supercomputer access travels on the same network, it's piggy-backing on the lines that have already been paid for out of existing budgets. So it seems free, and as long as the ancillary traffic doesn't impinge on the supercomputer access, no one is likely to complain. It's much like using your friend's computer's processing power to generate processor-intensive pictures when he's not using his computer. As long as your use doesn't slow down the things he wants to do, he probably won't mind, especially if it helps you finish your work sooner. But, if your use prevents him from doing his own work, he'll probably become less generous about it.
So, if the Internet did indeed move from governmental to private control, most people would not see the difference because their organizations would continue to foot the bill, especially if the costs didn't change. The danger is to poorly funded organizations such as grade schools and public libraries, which may only be able to afford their Internet connections with help from the government. Oh, and where do you think the government gets the money? Taxes, of course. So you end up paying one way or another.
After all of this discussion, you're probably confused as to who runs what on the Internet. Good, that's the way it should be, because no one person or organization runs the Internet as such. I think of the Internet as a collection of fiefdoms that must cooperate to survive. The fiefdoms are often inclusive as well, so one group may control an entire network, but another group controls a specific machine in that network. You as a user must abide by what both of them say, or find another host.
I don't mean to say that there aren't some guiding forces. The NSF exercised a certain influence over much of the Internet because it controlled a large part of it in the NSFNET. Thus, the NSF's Acceptable Use Policies (which state that the NSFNET may not be used for "commercial activities") became important rules to follow, or at least keep in mind, and I'll bet that many commercial providers used them as a starting point for creating their own, less restrictive acceptable use policies.
Several other important groups exist, all of which are volunteer-based (as is most everything on the Internet). The Internet Architecture Board, or IAB, sets the standards for the Internet. Without standards the Internet wouldn't be possible, because so many types of hardware and software exist on it. Although you must be invited to be on the IAB, anyone can attend the regular meetings of the Internet Engineering Task Force, or IETF. The IETF's meetings serve as a forum to discuss and address the immediate issues that face the Internet as a whole. Serious problems, or rather problems that interest a sufficient number of volunteers, result in working groups that report back to the IETF with a recommendation for solving the problem. This system seems haphazard, but frankly, it works, which is more than you can say for certain other organizations we could probably name.
Other networks undoubtedly have their controlling boards as well, but the most interesting is Usenet, which has even less organization than the Internet as a whole. Due to its roots in the user community, Usenet is run primarily by the community, as strange as that may sound. Network administrators control what news can come into their machines but can't control what goes around their machines. The converse applies as well -- if a sufficient number of network administrators don't approve of something, say a newsgroup creation, then it simply doesn't happen. Major events on Usenet must have sufficient support from a sufficient number of people.
Of course, some people's votes count more than others. These people are sometimes called net heavies because they often administer major sites or run important mailing lists. The net heavies consider it their job (who knows how they manage to keep real jobs with all the work they do here) to keep the nets running smoothly. Even though they often work behind the scenes, they do an excellent job. Shortly after I started TidBITS, for instance, I was searching for the best ways to distribute it. I wasn't able to run a mailing list from my account at Cornell, and TidBITS was too big to post to a general Usenet group every week. After I spoke with several of the net heavies, they allowed me to post to a moderated newsgroup, comp.sys.mac.digest, that had up to that point been used only for distributing the Info-Mac Digest to Usenet.
If you want to get involved with what organization there is on the Internet, I suggest that you participate and contribute to discussions about the future of the nets. Gradually, you'll learn how the system works and find yourself in a position where you can help the net continue to thrive.
You should keep one thing in mind about the Internet and its loose controlling structure: It works, and it works far better than do most other organizations. By bringing control down to almost the individual level but by requiring cooperation to exist, the Internet works without the strong central government that most countries use and claim is necessary to avoid lawlessness and anarchy. Hmm...
The Internet makes you think, and that's good.
I hope this chapter has provided a coherent view of where the Internet has come from, along with some of the people and networks that were instrumental in its growth. After any history lesson, the immediate question concerns the future. Where can we expect the Internet to go from here?
I'm an optimist. I'm sure you can find someone more than happy to tell you all the horrible problems -- technical, political, and social -- facing the Internet. I don't hold with such attitudes, though, because something that affects so many people around the world didn't appear so quickly for no reason. In one way or another, I think most people understand on a visceral level that the Internet is good, the Internet is here to stay, and if they want to be someone, they would do well to get access today and contribute in a positive fashion. Of course, books like this one only encourage such utopian attitudes.
In any event, I predict that the Internet will continue growing at an incredible rate. You might argue that the growth will slow from its 15 percent per month rate on the grounds that it's silly to assume anything can continue to grow at such a breakneck speed. A naysayer also might point at the massive influx of novices as endangering the Internet, or point at the increased level of commercialization as a major problem. I feel that such growth is self-propelling and that bringing more people and resources onto the Internet only further fuels the expansion. I think that growth is good -- the more people, the more resources, the more opinions, the better off we all are.
I also expect to see the Internet continue to standardize, both officially and informally. At lower levels, more computers will start to use IP instead of BITNET's NJE or the aging UUCP protocols. It's merely a matter of keeping up with the Joneses, and the Joneses are running IP. At a higher level, I think that using various network resources will become easier as they start migrating toward similar interfaces. Just as it's easy to use multiple applications within Windows because you always know how to open, close, save, and quit, so it will be easier to use new and enhanced services on the Internet because they will resemble each other more and more. Even now, people rely heavily on network conventions such as prefixing site names to indicate what services they provide, like ftp.tidbits.com for FTP, gopher.tidbits.com for Gopher, and www.tidbits.com for the World Wide Web.
And yes, I fully expect to see the Internet become more and more commercial, both in terms of where the service comes from and in terms of the traffic the Internet carries. However, we must remember the old attitudes about commercial use of the Internet. In the past, commercial use was often acceptable if it wasn't blatant, was appropriately directed, and was of significant value to the readers. In other words, I'll be as angry as the next person if I start receiving automatically generated junk email every day, just as I receive junk mail via snail mail. If such things start happening, the course of action will be the same as it always has been: politely ask the originator to stop once, and then, if that doesn't work, flame away -- that is, send back an outrageously nasty message.
Even though I'm optimistic, I know that problems will occur. For example, consider the so-called Green Card debacle. In the spring of 1994, the husband and wife law firm of Canter & Siegel posted a blatantly commercial message advertising a green card lottery and immigration services. That wasn't the problem. The problem was that they posted it to all 5,000 Usenet newsgroups, an act called spamming. Discussions about Celtic culture, communications (where I first saw it), and Washington state politics were all interrupted, along with thousands of other groups whose readers were completely apathetic about anything to do with immigration. Or at least they were apathetic until they were bludgeoned repeatedly with Canter & Siegel's post. All of a sudden, everyone cared a great deal about immigration, and sent 30,000 flame messages (more than 100 megabytes of text) to the offenders. That volume was far more than Canter & Siegel's provider, Internet Direct, could handle, and its machine went down like a boxer on the wrong end of a knockout punch.
The aftershocks keep coming, with Internet Direct suing Canter & Siegel for violating acceptable use policies (it seems that Canter & Siegel never signed the terms and conditions form) and for the detrimental effect the post had on business. In return, Canter & Siegel countersued for loss of business, claiming some ludicrous percentage of the messages were requests for more information (though they refused to provide any verifiable data). Needless to say, Internet Direct disabled their account immediately, and details about Canter & Siegel's history began to surface. They'd been kicked off other providers for similar smaller-scale posts in the past, they'd been suspended from the bar in Florida in 1987 for conduct the Supreme Court of Florida deemed "contrary to honesty," and so on. Canter & Siegel garnered a huge amount of press (most of it negative, but as the saying goes, "I don't care what you say about me as long as you spell my name right."). They have even set up a company to provide services to other companies that want to flood Usenet with advertising, and they've written a book about how to advertise on the Internet. That's a bit like ex-serial cannibal Jeffrey Dahmer writing a book about preserving meat.
The Canter & Siegel fiasco raises the question of how the Internet should be policed. In the past, as in the present, transgressions have been dealt with much as they might have been in the perhaps-fictional American Old West. Everyone takes justice into his own hands, and if a few innocents are hurt in the process, well, it was for the greater good. When Canter & Siegel spammed Usenet, thousands of people spammed them back.
This process is more commonly known as mail bombing. Mail bombs are generally small Unix programs (before you ask, I don't know of any for Windows) that simply send a user-configured number of messages (using a specified file as the message body) to a given address, often ensuring that none of the mail bomb messages appear to come from a real address. A better solution came from a Norwegian programmer, who created a spambot (his term, not mine), a program that somehow figures out which newsgroups Canter & Siegel spammed (yes, it happened again, although on a smaller scale) and bounces the spamming message back to them, along with a short note daring them to sue him, since he's in Norway.
Frontier justice sounds like great fun, especially when slimy lawyers are on the other end, but it raises some interesting issues. Mail bombing a machine doesn't affect just that machine -- it affects many of the machines nearby on the Internet. In the case of a public machine like Internet Direct's indirect.com, it also hurts an innocent business and hundreds of innocent users who also use that machine. And, although the Internet as a whole can deal with the occasional mail bomb attack, if such things happened every day, they would seriously impair Internet communications. Such possibilities raise the specter of regulation, something that most Internet users disapprove of (though certain usage regulations are built into the service agreements of almost every Internet provider for liability reasons). So, will the government get involved and lay down the law about inappropriate Internet use? Probably not. The people who must do the regulating are the providers themselves -- there's no way to prevent everyone from retaliating against spam attacks like Canter & Siegel's, so the best place to stop such attacks is at the level of the providers. They can simply refuse to give problem users an account, or remove accounts when abuse occurs. But the government itself? I certainly hope not.
I don't believe that the Internet will ever be governed to a much greater extent than it now is (at least in the U.S.), simply because I don't believe it's feasible. How can you govern something that spans the globe or police something that carries gigabytes of data every day? The U.S. government could easily ban pornographic postings, say, but how does that affect someone in a different country? Or how does that affect U.S. users retrieving the pornographic images from another country? Remember, the Internet can simply route around censorship. It's all very confusing, and it will be some time (if ever) before the government understands all of the issues surrounding the Internet sufficiently to produce reasonable legislation. Of course, that leaves open the possibility of unreasonable legislation, but that's always a fear.
The way the government as a whole currently views the Internet reminds me a bit of the joke about how to tell if you have an elephant in your fridge. The answer is by the footprints in the peanut butter -- it's the middle of the night, and the government is standing at the open door, yawning and blinking at those massive footprints. Luckily, different parts of the government are starting to wake up, which should help dispel the dangerous ignorance that has marked certain past government Internet actions. For example, there was the Steve Jackson case, in which the Secret Service, completely inappropriately, confiscated the computer systems of a popular publisher of role-playing games. The damage award from that case enabled Steve Jackson Games to create an Internet provider called Illuminati Online (io.com). Perhaps the greater problem now with the government's view of the Internet is that it seems more concerned with regulating occasional outrageous behavior than with using the power of the Internet to further the public good. Personally, I prefer my government to be more interested in helping than in regulating. Of course, then there are the people who would prefer that the government just stay out of the way, but somehow I doubt that will happen any time soon.
I've tried to give a glimpse of the history of the Internet, from its first stumbling steps as the military- and research-based ARPAnet to the swift NSFNET and the commercial providers of today. If nothing else, it's worth noting that those who ignore history are condemned to repeat it, and by paying attention to the mistakes of the past, perhaps we can avoid repeating them. The future will also bring new problems and new opportunities, but for the moment we can only speculate as to what those might be. So put all that out of your mind, because the next chapter takes you on a tour of the Internet of today.