
The Last of the Global Commons


[Image: aglobal.jpg] "Mastery of the sea, outer space and the air: these concepts are easy enough to understand -- what other 'Commons' could there be?"

by Richard Fernandez, PJM editor, Sydney

March 29, 2007 - 6:31 am

The foundation of American power, wrote Barry Posen in 2003, was far deeper than a mere preponderance of current military or economic power. It lay in what he called the Command of the Commons, made possible by "all the difficult and expensive things that the United States does to create the conditions that permit it to even consider one, two, or four campaigns".

The U.S. military currently possesses command of the global commons. Command of the commons is analogous to command of the sea, or in Paul Kennedy’s words, it is analogous to “naval mastery.” The “commons,” in the case of the sea and space, are areas that belong to no one state and that provide access to much of the globe.  Airspace does technically belong to the countries below it, but there are few countries that can deny their airspace above 15,000 feet to U.S. warplanes. Command does not mean that other states cannot use the commons in peacetime. Nor does it mean that others cannot acquire military assets that can move through or even exploit them when unhindered by the United States. Command means that the United States gets vastly more military use out of the sea, space, and air than do others; that it can credibly threaten to deny their use to others; and that others would lose a military contest for the commons if they attempted to deny them to the United States. Having lost such a contest, they could not mount another effort for a very long time, and the United States would preserve, restore, and consolidate its hold after such a fight. …

The United States enjoys the same command of the sea that Britain once did, and it can also move large and heavy forces around the globe. But command of space allows the United States to see across the surface of the world’s landmasses and to gather vast amounts of information. At least on the matter of medium-to-large-scale military developments, the United States can locate and identify military targets with considerable fidelity and communicate this information to offensive forces in a timely fashion. Air power, ashore and afloat, can reach targets deep inland; and with modern precision-guided weaponry, it can often hit and destroy those targets.

Mastery of the sea, outer space and the air: these concepts are easy enough to understand. What other "Commons" could there be? But as anyone who lives in this information age can testify, all of us now live on the edge of a pathway to other consciousnesses, connected by the one thing as influential in the 21st century as the Mahanian sea was in the 19th: the Internet. Today, the Internet provides the highway for many of the essential activities of modern life: email, file-sharing, audio and video streams, VOIP telephony and the World Wide Web. Yet unlike the sea, the cosmos and the air, which are primeval, the Internet is wholly man-made; and though it belongs "to no one state", it provides "access to much of the globe". Is the Internet another of the Global Commons, and what would it mean to command it? The story of the Internet began with the launch of the Soviet Sputnik in 1957.

The USSR’s launch of Sputnik spurred the United States to create the Advanced Research Projects Agency (ARPA, later known as the Defense Advanced Research Projects Agency, or DARPA) in February 1958 to regain a technological lead. … J. C. R. Licklider … saw universal networking as a potential unifying human revolution. … Licklider recruited Lawrence Roberts to head a project to implement a network, and Roberts based the technology on the work of Paul Baran who had written an exhaustive study for the U.S. Air Force that recommended packet switching … to make a network highly robust and survivable. After much work, the first node went live at UCLA on October 29, 1969 on what would be called the ARPANET, one of the "eve" networks of today’s Internet.
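The robustness argument behind packet switching is easy to make concrete. The sketch below, in Python, uses an invented four-node mesh rather than anything from Baran's actual study: a message is broken into packets, each packet is routed independently over whatever links survive, and the pieces are reassembled at the destination, so the loss of a single node need not stop delivery.

    # Illustrative packet switching: split a message into packets, route each
    # one independently around failures, and reassemble at the destination.
    # The mesh and the failure model are invented for this sketch.
    import random

    LINKS = {  # a small redundant mesh: node -> neighbours
        "A": ["B", "C"], "B": ["A", "C", "D"],
        "C": ["A", "B", "D"], "D": ["B", "C"],
    }

    def route(src, dst, down=frozenset()):
        """Breadth-first search for any surviving path from src to dst."""
        frontier, seen = [[src]], {src}
        while frontier:
            path = frontier.pop(0)
            if path[-1] == dst:
                return path
            for nxt in LINKS[path[-1]]:
                if nxt not in seen and nxt not in down:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None

    def send(message, src="A", dst="D", size=8):
        packets = [(i, message[i:i + size]) for i in range(0, len(message), size)]
        received = {}
        for seq, payload in packets:
            down = {"B"} if random.random() < 0.5 else set()  # node B drops out half the time
            path = route(src, dst, down)                      # each packet reroutes on its own
            print(f"packet at offset {seq} travelled {'->'.join(path)}")
            received[seq] = payload
        return "".join(received[i] for i, _ in packets)       # reassembled in order

    print(send("a highly robust and survivable network"))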

In its present form, the Internet is a highway made up of physical hardware and, equally important, of standard protocols which allow the hardware to be used in a consistent way. Together they enable the world to communicate in a way unthinkable even a few decades ago. One way to understand the distinction between the hardware and the standards which allow universal comprehension is to consider the terms "Internet" and "World Wide Web". As Wikipedia puts it:

The Internet and the World Wide Web are not synonymous: the Internet is a collection of interconnected computer networks, linked by copper wires, fiber-optic cables, wireless connections, etc.; the Web is a collection of interconnected documents and other resources, linked by hyperlinks and URLs. The World Wide Web is accessible via the Internet, as are many other services including e-mail and file sharing.
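The layering can be seen directly from a terminal. In the sketch below (Python standard library only; the host name is just an example), the socket connection is the Internet doing its job of moving bytes between machines, while the HTTP request written over that socket is the World Wide Web, one service among many; an e-mail client would open the same kind of connection and simply speak SMTP instead.

    # The Internet supplies the transport: a TCP connection over IP.
    # The Web is one service on top of it: the HTTP protocol plus documents.
    # E-mail (SMTP) or file transfer (FTP) would use the same kind of socket,
    # just a different port and a different conversation.
    import socket

    HOST = "example.com"  # illustrative host

    with socket.create_connection((HOST, 80), timeout=10) as conn:   # the Internet
        request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
        conn.sendall(request.encode("ascii"))                        # the Web (HTTP)
        reply = b""
        while chunk := conn.recv(4096):
            reply += chunk

    print(reply.split(b"\r\n")[0].decode())   # e.g. "HTTP/1.1 200 OK"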

Today, both the Internet and the standards that govern the flow of information across it are under immense strain from their own success. The original design, premised on a world of fixed computers connected by wiring, has been patched up to accommodate mobile devices and to host services for which it was never envisioned, with less and less success. In fact, researchers at Stanford University believe that the Internet in its present form is doomed to die, which is why scientists and private industry are reinventing it all the time, often without the knowledge of the public or of governments. In a paper called "Clean-Slate Design for the Internet", Nick McKeown and Bernd Girod of Stanford describe just one of the many initiatives to find a new architecture for the global network.

The current Internet has significant deficiencies that need to be solved before it can become a unified global communication infrastructure. Further, we believe the Internet’s shortcomings will not be resolved by the conventional incremental and “backward-compatible” style of academic and industrial networking research. …: “With what we know today, if we were to start again with a clean slate, how would we design a global communications infrastructure?”

The Stanford scientists ask, "How should the Internet look in 15 years?" They think the future Internet must:

  1. Be inherently secure against malware such as viruses;
  2. Support mobile end-hosts such as mobile computers, cellular telephones and embedded chips;
  3. Allow the market to price the services offered on it in a rational way;
  4. Support "anonymity where prudent" and "accountability where necessary"; and
  5. Be able to become virtual itself. The underlying hardware should present a logical representation of itself to every device connected to it: the physical network will wear a "skin" that devices perceive, and it will be able to change under that skin, so alterations can be made without shutting the network down, much as a computer's software can be updated while you keep working. (A minimal sketch of this idea follows the list.)
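The fifth point, virtualization, is the least intuitive, so here is a minimal sketch of the idea (the class and all names are invented for illustration): devices address each other through a stable logical view, while the mapping to physical hardware underneath can be changed without the devices ever noticing.

    # A logical "skin" over physical hardware: devices use stable logical
    # addresses, and the substrate underneath can be remapped on the fly.
    # Names and topology are invented for this sketch.
    class VirtualNetwork:
        def __init__(self):
            self.mapping = {}                    # logical address -> physical node

        def attach(self, logical, physical):
            self.mapping[logical] = physical

        def remap(self, logical, new_physical):
            """Change the hardware under the skin; the logical view is untouched."""
            self.mapping[logical] = new_physical

        def send(self, src, dst, payload):
            phys = self.mapping[dst]
            print(f"{src} -> {dst}: delivered via physical node {phys}: {payload!r}")

    net = VirtualNetwork()
    net.attach("printer", "rack7-port3")
    net.send("laptop", "printer", "page 1")      # goes via rack7-port3

    net.remap("printer", "rack9-port1")          # hardware swapped, network not shut down
    net.send("laptop", "printer", "page 2")      # same logical address still works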

But just as the original Internet, conceived in the depths of the Cold War, was unintentionally a network with a First Amendment time bomb ticking inside it, so too are the Stanford project and several other initiatives currently underway charged with implicit subversion. An Internet in which the protection of "anonymity" was built into the design, whose utilities supported a market valuation of the services it offered, and which offered connectivity to mobile devices would be a nightmare to totalitarian governments. Worst of all, a fully programmable, virtual Internet could potentially become anything at all: the ultimate horror scenario for any organization addicted to control. If today's Internet is already a threat to the established order, it will be as nothing to tomorrow's Internet, poised to let J. C. R. Licklider's dream of universal networking become a reality, in a form much more sophisticated and powerful than even he could have imagined.

Just how powerful it might become is illustrated by efforts to transform one major service of the Internet, the World Wide Web, from a sea of undifferentiated links into a tagged store where information can be gathered, assembled, processed and disseminated automatically. John Borland in the MIT Technology Review writes that today's Web is as disorganized as libraries before the Dewey Decimal System. What was needed, some argued, was a way to allow software agents to get a grip on information, both to bar objectionable material and to find what was truly sought.

Nor was it just librarians who worried about this disorder. Companies like Netscape and Microsoft wanted to lead their customers to websites more efficiently. Berners-Lee [the originator of the World Wide Web] himself, in his original Web outlines, had described a way to add contextual information to hyperlinks, to offer computers clues about what would be on the other end. … the idea of connecting data with links that meant something retained its appeal. … To use an old metaphor, imagine the Web as a highway system, with hyperlinks as connecting roads. The early Web offered road signs readable by humans but meaningless to computers.

General-purpose metadata … would be a boon to people, or computers, looking for things on the Web. … the idea of a "semantic" Web [emerged] which not only would provide a way to classify individual bits of online data such as pictures, text, or database entries but would define relationships between classification categories as well. … To go back to the Web-as-highway metaphor, this might be analogous to creating detailed road signs that cars themselves could understand and upon which they could act. The signs might point out routes, describe road and traffic conditions, and offer detailed information about destinations. A car able to understand the signs could navigate efficiently to its destination, with minimal intervention by the driver. …

In articles and talks, Berners-Lee and others began describing a future in which software agents would similarly skip across this "web of data," understand Web pages’ metadata content, and complete tasks that take humans hours today. Say you’d had some lingering back pain: a program might determine a specialist’s availability, check an insurance site’s database for in-plan status, consult your calendar, and schedule an appointment. Another program might look up restaurant reviews, check a map database, cross-reference open table times with your calendar, and make a dinner reservation.
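The appointment-booking scenario above can be sketched in a few lines. The toy triple store below is pure Python with an invented vocabulary, not actual RDF, but it shows the essential move: once facts are published as machine-readable subject-predicate-object statements, a simple agent can combine them mechanically.

    # Toy "web of data": facts as (subject, predicate, object) triples that an
    # agent can query. Vocabulary and data are invented; real systems use RDF.
    TRIPLES = [
        ("dr_lee",      "specialty",    "orthopedics"),
        ("dr_lee",      "accepts_plan", "acme_health"),
        ("dr_lee",      "has_opening",  "2007-04-02 10:00"),
        ("dr_patel",    "specialty",    "orthopedics"),
        ("dr_patel",    "accepts_plan", "other_plan"),
        ("my_calendar", "free_at",      "2007-04-02 10:00"),
    ]

    def query(subject=None, predicate=None, obj=None):
        """Return every triple matching the pattern; None acts as a wildcard."""
        return [t for t in TRIPLES
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

    # An "agent" booking a back specialist covered by the user's plan:
    specialists = {s for s, _, _ in query(predicate="specialty", obj="orthopedics")}
    in_plan     = {s for s, _, _ in query(predicate="accepts_plan", obj="acme_health")}
    free_slots  = {o for _, _, o in query(subject="my_calendar", predicate="free_at")}

    for doctor in specialists & in_plan:
        for _, _, slot in query(subject=doctor, predicate="has_opening"):
            if slot in free_slots:
                print(f"book {doctor} at {slot}")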

Efforts to structure the Web immediately provoked a debate which illustrates how the struggle over mastery of an information commons is decided: by negotiation, and by the standards established by those who use it. At the heart of the debate lay the question of whether the World Wide Web could ever, even in principle, be completely tamed, or whether in fact it was better left in unstructured free form. The way the debate is being resolved shows how the Internet itself changes.

Proponents of an unstructured World Wide Web argued that its looseness was not a disadvantage but its key strength.

"People forget there are humans under the hood and try to treat the Web like a database instead of a social construct," says Clay Shirky, an Internet consultant and adjunct professor of interactive telecommunications at New York University. …"The world is not like a set of shelves, nor is it like a database," says NYU’s Shirky. "We see this over and over with tags, where we have an actual picture of the human brain classifying information."

The proponents in both the "structured" and "unstructured" camps went forward according to their chosen strategies, while remaining compatible with the least common denominators of the World Wide Web. They created islands of varying richness within the same highway network. Some took the tack of letting humans continue to classify and add information in their own chaotic but creative way. "The socially networked, tag-rich services of Flickr, Last.fm, Del.icio.us, and the like are already imposing a grassroots order on collections of photos, music databases, and Web pages. Allowing Web users to draw their own connections, creating, sharing, and modifying their own systems of organization, provides data with structure that is usefully modeled on the way people think, advocates say." But for others it made sense to organize their information stores in structured ways that would facilitate access and manipulation, as well as add value.

Life scientists with vast stores of biological data have been especially interested. In a recent trial project at Massachusetts General Hospital and Harvard University … clinical data was encoded using Semantic Web [structured] techniques so that researchers could share it and search it more easily. … Citigroup’s global head of capital markets and banking technology, Chris Augustin, is heading an initiative to use semantic technologies to organize and correlate information from diverse financial-data feeds. …

One of the highest-profile deployments of Semantic Web technology is courtesy of Joost, the closely watched Internet television startup formed by the creators of Skype and Kazaa. The company has moved extraordinarily quickly from last year’s original conception, through software development and Byzantine negotiations with video content owners, into beta-testing of its customizable peer-to-peer TV software … Joost’s … infrastructure also means that users will have wide-ranging control over the service … People will be able to program their own virtual TV networks–if an advertiser wants its own "channel," say, or an environmental group wants to bring topical content to its members–by using the powerful search and filtering capacity inherent in the semantic ordering of data.

In the end, both the structured and freeform camps will probably coexist on the World Wide Web, which is itself being built over an evolving Internet. "Semantic Web technologies add order to data from the outset, putting up the road signs that let computers understand what they’re reading. But many researchers note that much of the Web lacks such signs and probably always will. … No one knows what organizational technique will ultimately prevail. But what’s increasingly clear is that different kinds of order, and a variety of ways to unearth data and reuse it in new applications, are coming to the Web. There will be no Dewey here, no one system that arranges all the world’s digital data in a single framework."

But an Internet in which the users themselves impose "different kinds of order" and "reuse it in new applications" creates an information commons commanded by an implicitly American model. Although not under the authority of any US government agency, such an Internet would be almost natively American, and would consequently confer an advantage upon a society whose institutions and traditions are already configured to use it. Even the evolution of the Internet and its protocols is taking place in an alarmingly non-governmental way, an idea which justly horrifies the United Nations. As a result, the UN has embarked upon a program of mandating "governance" over the Internet, in a manner reminiscent of King Canute exercising his dominion over the sea, with a program whose narrowness and bureaucratic cast is self-evident.

Four options for the management of Internet-related public policy issues were proposed in the Final Report of the WGIG, finalised during their fourth meeting, and presented to stakeholders on 18 July 2005 in preparation for the November 2005 meeting in Tunis, Tunisia. These proposals all include the introduction of an open Multi-stakeholder based Internet Governance forum to give greater voice to the stakeholders around the world, including civil society, private sector and governments. Each model also included different strategies for the oversight role, currently held by the United States Department of Commerce.

The proposed models were:

  1. Create the Global Internet Council (GIC) consisting of governments and involved stakeholders to take over the U.S. oversight role of ICANN.
  2. Ensure that ICANN's Governmental Advisory Committee is an official forum for debate, strengthening its position by allowing for the support of various governments.
  3. Remove the U.S. oversight of ICANN and restrict it to the narrow technical role, forming the International Internet Council (IIC) to manage most aspects of the Internet administration.
  4. Create three new bodies:
    • The Global Internet Policy Council (GIPC) to manage "internet-related public policy issues"
    • The World Internet Corporation for Assigned Names and Numbers (WICANN) to take over from ICANN
    • The Global Internet Governance Forum (GIGF), a central debating forum for governments.

The UN "governance" program, in comparison to the intellectual fecundity of the Internet itself, is almost comically sterile, like an idiot seeking to take charge of a font of creativity. Yet for all the ham-handedness of the United Nations, its basic fear is well-founded. In a world where America already dominates the Commons of the Sea, Outer Space and the Air, can other countries allow the establishment of an Information Commons, the Internet, whose design emphasizes the free flow of information, privacy and enterprise? Every tyranny and theocracy on the planet must regard it as an intolerable danger. On the other hand, do they have a choice?

