Gordon Crovitz’s Wall Street Journal editorial “Who Really Invented the Internet?” (July 23, 2012) generated a lot of blowback owing to factual errors in his recounting of how certain network technologies were developed. Because he used this story to support a broader case for limiting government intrusion into technology markets, the responses were often vituperative. Although Crovitz got some of the history wrong, I agree with his point of view:
“It’s important to understand the history of the Internet because it’s too often wrongly cited to justify big government. It’s also important to recognize that building great technology businesses requires both innovation and the skills to bring innovations to market. As the contrast between Xerox and Apple shows, few business leaders succeed in this challenge. Those who do—not the government—deserve the credit for making it happen.”
The bile of the complaints seems to be motivated more by a dislike of Crovitz’s view of government than by a desire to set the record straight. Many pointed to a few now-seminal results of certain government-funded projects as proof positive that public R&D is necessarily a good thing, that Crovitz’s factual errors mark him as an ignorant fool, and that what he has to say is therefore simply a right-wing polemic.
The confluence of many technology streams made today’s Internet. It is now so ingrained in daily life that it is hard to grasp what an immense project it has been. Suppose, circa 1990, you had issued a request for proposal for a network system that would securely and reliably transmit millions of bits per second between any computer, anywhere in the world, for fractions of a cent per megabyte. Oh, and by the way, this network must also reach hundreds of millions of mobile hand-held endpoints (like those exotic new-fangled cell phones), all kinds of vehicles, any industrial machine, residential appliances, home entertainment systems, and medical devices. It must not have any central control, but all of its parts must follow certain rules (protocols), which will be developed on the fly. I’m sure this would have been viewed as science fiction and/or a grandiose delusion by nearly everyone at the time. So how did all this (and more) happen anyway?
What has been missed in Crovitz’s chorus of condemnation is a recognition of the unquestionably crucial role of entrepreneurs, markets, and risk capital in creating today’s incredible global Internet. I am certain that no single entity, government or private, could have achieved this and that it could only have been a product of free-market capitalism.
Some commenters wondered what kind of network infrastructure we would have now if the early Internet had been a commercial project, suggesting (snicker, snicker) that something like AOL would have been the result. In fact, the Internet has been, and continues to be, realized by tens of thousands of private companies responding rapidly to market forces, producing network equipment, services, and software.
What if all this had been only a creature of government? We have a few case studies of what happens when a government takes control of a public network. In a European democracy (France), we got the Minitel system (shut down about a month ago after a 30-plus-year run). More often, a national network supports a police state in closed societies like China, North Korea, and Iran. I suspect that if “the Internet” had been wholly a US government project, it would probably resemble the US Postal Service or Amtrak. But then, what would have been the Internet of national Internets? Would we have needed a North American Free Information Zone?
Also missing from the chorus of condemnation is a recognition that substantial commercial network technology evolved in parallel with the Arpanet project. (The US Department of Defense funded Arpanet from the late 1960s through the early 1970s. It produced technologies that, after much revision, became part of today’s Internet. Crovitz’s recounting of its history provoked the furor.) I worked with several early networks: IBM’s remote data entry over its bisync protocol, DECnet (Digital Equipment’s approach to interconnecting its computers), and ISDN (AT&T’s high-speed digital data link for switched networks). I used a statistical multiplexer that achieved a screaming 9600 bits per second, full duplex, over leased lines – very bleeding-edge then. The alternative was shipping reels of magnetic tape over hundreds of miles, daily. There were, of course, many others. Unlike government-funded R&D, these technologies had to make money.
Many commercial innovations that enable today’s Internet were privately developed in the 1980s, building on commercial technology and markets established in the 1970s. I worked with several of them. CompuServe operated a widely used X.25 packet network that used PDP-8s to control many dial-up modems. IBM’s Token Ring LANs and SNA achieved broad use. The Quotron system replaced the punched-paper “ticker” tape system that had used the Teletype network to disseminate near-real-time market data among US stock exchanges. Cisco was founded in the mid-1980s to provide high-capacity equipment for Ethernet networks, building on the Ethernet technology Xerox pioneered. The Williams Pipeline company (Tulsa, Oklahoma) spun off Williams Telecommunications (WilTel) to pull fiber cable through its gas pipes and create a broadband backbone network. US Sprint built out its long-distance service on this fiber, immortalized in Sprint’s “pin drop” ads. I subsequently worked with WilTel to develop automated testing for their frame-relay switching systems, the better to push bits through gas pipes. Although many of these technologies are now long gone, they informed the designs of competitors and provided proof of commercial viability for investors in start-ups like US Robotics, 3Com, and Cisco, which subsequently brought to market technologies we take for granted today.
Many competing network architectures and protocol stacks were in use during these early decades, so data communication among remote computers was not unique to Arpanet. I’m not sure why TCP/IP came to be the de facto standard in the two layers of the stack it inhabits (and hence its subsequent enshrinement and role in the “invention” of the Internet). It probably has a lot to do with the fact that, by the late 1980s, it was proven, available, interoperable, and license-free. As a result of this low barrier to adoption, it was the right technology at the right time and reached dominance within a few years. By 1995, every platform vendor had to have a TCP/IP stack or they couldn’t sell anything. The unintended lock-in became permanent by the end of the 1990s. TCP/IP is now so deeply embedded in all kinds of networking that replacing it is unthinkable.
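One consequence of every platform shipping a TCP/IP stack is that the Berkeley sockets interface became a near-universal programming model, on Unix, Windows, and everything since. As a rough illustration (a minimal loopback exchange sketched in Python; the port number is arbitrary):

```python
import socket

# A TCP listener and client on the loopback interface. The same
# handful of calls (socket, bind, listen, connect, accept, send,
# recv) work essentially unchanged on every modern platform.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 54321))  # arbitrary unprivileged port
server.listen(1)

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", 54321))  # handshake completes in-kernel
conn, _ = server.accept()

client.sendall(b"hello")
data = conn.recv(1024)
print(data.decode())  # hello

for s in (client, conn, server):
    s.close()
```

The API predates most of the vendors mentioned here; its ubiquity is a side effect of exactly the lock-in described above.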
Several innovations dominated networking in the 1990s. Cheap personal computers drove the emergence of client/server computing, which in turn drove demand for and development of local-area network technology. Clunky standalone network interface boxes became add-in cards, which were then integrated onto motherboards. PCs also enabled the simultaneous rise of popular distributed applications (e.g., bulletin board systems and email) and the market for dial-up equipment. With the coincidental development of HTTP and early browsers (government-sponsored, at CERN and the University of Illinois), content displayed in GUIs gained a whole new dimension – it seemed like magic when you first used it. It wasn’t hard to see the staggering commercial potential of “the web” (billions of eyeballs). That inflated the investment bubble of 1998 to 2000. Although many technical failures and financial losses ensued, the bubble also launched technologies and businesses that are key parts of today’s Internet. It drove an insatiable appetite for bandwidth, which spurred a rapid increase in the availability and capacity of digital broadband – so much so that we had a lot of “dark fiber” for a few years. No more.
In the last decade, the US government has funded network R&D in many ways, notably its recent focus on “cyber security”. We’ll have to see where that leads. I can say with certainty that today’s Internet is the result of a lot of high-risk investment in technology businesses and, increasingly, the contributions of the open source community. The extent to which markets self-organized to achieve all this is simply astounding. Publicly traded companies that provide equipment, physical plant, and services are essential to today’s Internet. In 2011, these 268 firms sold about $1.4 trillion of related goods and services. That’s real money, even in Washington, D.C. Here are the numbers by sector:
| Sector | 2011 Revenue, $ Billions | Number of Public Companies |
|---|---:|---:|
| Diversified Communication Services | 58.6 | 30 |
| Internet Information Providers | 63.2 | 48 |
| Internet Service Providers | 2.0 | 8 |
| Internet Software & Services | 10.6 | 24 |
| Networking & Communication Devices | 56.3 | 18 |
| Telecom Services – Domestic | 309.9 | 22 |
| Telecom Services – Foreign | 439.4 | 18 |
Many other key players – start-ups and private firms – are not included in this tally. While it is true that the results of certain government-sponsored R&D evolved into critical pieces of the Internet as we know it today, it is ludicrous to assert that any single actor, private or public, can claim sole parentage. It is also clear that only a highly competitive and profitable market economy could have engendered the dazzling array of technology we now cannot imagine living without.