§ 3. Everything

To jump from the advent of writing to the birth of the World Wide Web might seem a vast leap, but many of the same suspicions and problems began to proliferate as the Web grew in size and impact through the early 1990s. In the Phaedrus and the Seventh Letter, Plato struggled with the educational problems that literacy and writing were unleashing on his Greek students and society. He was distinctly skeptical of the new technology, saw it as all flash, and, worse, believed it circumvented proper thought and memory by dislocating words from a man's head and heart. Plato questioned the value of a medium which delivered words and thoughts but of which one could ask no question and against which one could offer no counter-argument, both vital to his sense of gaining wisdom. How could you judge the credibility or basis of an idea if you could not see where it came from, if it simply circulated into the world detached from its invisible origin? Not surprisingly, many of those criticisms have beset every new information or communication technology since: the printing press, the telegraph and the radio were each seen as further abstracting and dislocating the ideas of the world, sending them spinning into the ether without context or foundation. Establishing the veracity, reliability and stability of information on the Web (as one example) is still one of the Internet's deepest and most difficult problems.
What we are witnessing is not a transition towards 'post industrial' society, where information would take the place of energy and raw materials. We are witnessing the industrialization of data, information, knowledge, even wisdom, in a process which tries to apply to this field of human activity the basic principles of industrial society: standardization, mass production, maximization of output, synchronization of activities, concentration and centralization. 18
Imagine. Not the information economy, or even the knowledge economy, but the "Wisdom Economy"? It's hard to believe the marketers haven't latched on to that yet- soon, no doubt. 19 But by 1986 it was very clear that the first moves towards the commercialization of the Internet were already well underway (though it would take until 1994 before the major banks and chains like Pizza Hut felt the Internet might be worthwhile). Clearly, when the imperatives of commerce run up against the ethics of sharing and free exchange, there is bound to be some resulting confusion. By 1986, paid data services like Steve Case's Quantum network in the US (with a slick graphical interface for PCs, Apples and Tandys) immediately found a niche with consumers, and were just as immediately held up to ridicule by the 'serious' on-line communities. Yet by 1989 Case's outfit had morphed into America On-Line (soon with over four million subscribers), and a giant of consumer data-services was born. AOL would go on to swallow its competitor CompuServe (in a three-way deal with WorldCom), and a few years later Netscape and finally Time-Warner, as the World Wide Web quickly emerged as the information service platform of choice.

The WWW had itself emerged in 1991 from researchers at the CERN physics laboratory in Geneva, using the HyperText Transfer Protocol (HTTP), HyperText Mark-up Language (HTML) and URL (Uniform Resource Locator) standards to build a dynamic, inter-connected document system which would stimulate cross-referential research (a small illustrative sketch of this document model appears after the quotation below). The idea immediately caught on. Lynx, an all-text browser, was used with the system at first, and not long after, in 1993, University of Illinois student Marc Andreessen's Mosaic software became the first widely used graphical browser. 20 For many, the introduction of that graphical element seemed to be the final piece falling into place, and the traditional media outlets (by now in the process of developing electronic media platforms or partnerships of their own) began an unending barrage of coverage and speculation about the commercial and social transformation unavoidably around the corner. From that deluge, one writer, Nicholas Negroponte (head of the MIT Media Lab), seemed to rise above the rest in his 1995 book, Being Digital, characterizing the tone of many as he predicted economic upheaval:
The radical transformation of the nature of our job markets, as we work less with atoms and more with bits, will happen just about the same time the two billion strong labor force of India and China starts to come on-line. A self-employed software designer in Peoria will be competing with his or her counterpart in Pohang. A digital typographer in Madrid will do the same with one in Madras. 21
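Stepping back briefly to the machinery mentioned above: the Web's 'dynamic, inter-connected document system' rests on a simple division of labour. HTML marks up the documents, URLs name them, and HTTP is the protocol a browser uses to fetch whichever document a hyperlink points to. The short sketch below (in Python) is only an illustration of that document model; the page content, addresses and class name are invented for this example, and nothing here describes CERN's actual software. It pulls the hyperlink targets out of a toy HTML page, showing how one document's markup names the next documents a browser could request.

    # A toy illustration of the Web's document model (hypothetical page content
    # and addresses). An HTML page is reached via a URL over HTTP, and its
    # <a href="..."> links name further documents to fetch.
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    SAMPLE_PAGE = """
    <html><body>
      <h1>Hypertext notes</h1>
      <p>See the <a href="/protocols/http.html">transfer protocol</a>
         or this <a href="http://www.example.org/physics/results.html">external paper</a>.</p>
    </body></html>
    """

    class LinkExtractor(HTMLParser):
        """Collect the href targets of <a> tags: the cross-references that
        knit separate documents into one 'web'."""
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        # Relative references are resolved against the page's own URL.
                        self.links.append(urljoin(self.base_url, value))

    parser = LinkExtractor(base_url="http://www.example.org/hypertext/notes.html")
    parser.feed(SAMPLE_PAGE)
    for url in parser.links:
        print(url)  # each URL names another document a browser could fetch over HTTP

Nothing more is needed for the cross-referencing described above: each document simply names, by URL, the other documents it refers to, and the browser follows.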
Certainly the year that book was published, 1995, was a tumultuous one: the North American Free Trade Agreement (NAFTA) had gone into effect the year before (leading, on Jan. 31, to President Clinton's authorizing a $20 billion loan to help stabilize the Mexican peso). The Unabomber was still at large (Theodore Kaczynski was not apprehended until April of 1996), and his anti-technological treatise ran that year in the Washington Post (in a decision made jointly with the New York Times) after he promised to cease his campaign of domestic terrorism. 1995 was also the year the Internet economy was effectively christened by the first explosive Internet start-up IPO (by Netscape 22), followed a few months later by the first domain name lawsuit, launched after Sprint had been permitted to register ownership of the site MCI.com (MCI being its main competitor at the time). The legal clashes over content control in cyberspace also extended to Europe, as CompuServe Germany unilaterally suspended all sexually-explicit news groups after police raids on its offices seeking evidence of Nazi material posted on its servers.

This contention quickly spread back across the Atlantic, and 1996 saw a slew of Internet-inspired legislation and litigation: the US Communications Decency Act was passed to shield children from pornographic material (and later struck down by the Supreme Court as unconstitutional); investigations into Microsoft began after Netscape made a formal complaint to the US Department of Justice about the company's anti-competitive practices; and the issue of digital copyright was finally tackled by the World Intellectual Property Organization, which forwarded its new Copyright Treaty to member states for signature. For this last item, the US delegate Bruce Lehman pushed very hard on behalf of the American media and entertainment industries to remove 'fair use' clauses entirely, outlawing forwarding, duplicating or even on-line reading without the permission of copyright holders. The other international representatives (particularly those from UNESCO) rightly realized this would lead to even more drastic shifts towards corporate control. After a decade of increasing media concentration and mergers, that seemed a potent threat, and Lehman's proposals were dropped. True to that pattern, electronic media mergers continued: AOL paid $4.3 billion US for the Netscape company in 1998, and early in 2000 announced its merger with the Time-Warner media conglomerate. That last deal finally closed at the beginning of 2001.

While the volume of information circulation and data flow continues to increase almost exponentially from year to year in the electronic environment, the field of information ownership, publishing and provision grows ever narrower. The sense that the Internet is a two-way medium, of communication rather than transmission, seems to be under increasing pressure as sites and sources continue to consolidate, go public and, essentially, be expected to turn a profit. For better or worse, netizens are increasingly information consumers rather than providers: a user is given access under the tacit understanding that they may search, but not disturb or add to, the often privately-owned and copyrighted collection. One extraordinarily lucid programmer, Ellen Ullman, sees in this structure something familiar from the off-line world:
The same could be said of a library, except that libraries have something the Internet considers nearly anathema: librarians. The current reigning ideology of the Internet is strictly opposed to the idea of a librarian's overriding sensibility, opting instead for the notion that anything, in and of itself, is worthy content. So it is entirely up to the end user to distinguish junk from literature. Hence the rapid proliferation of search engines. It is interesting to note that, over time, the search engines themselves are beginning to incorporate biases and strategies that could be characterized as ordering sensibilities. 23
The process Ullman is describing has a well-worn name in software development circles- disintermediation- and it is widely viewed as one of the great forces in North America behind the development of e-commerce. The theory holds that, where possible, most consumers would prefer a DIY (do-it-yourself) approach to handling their transactions. By removing the 'intermediary' (be that a bookstore cashier, clothing salesperson, bank teller or postal clerk), you theoretically allow the person seeking service to complete the transaction on their own. Banking and government services were among the first to be streamlined this way (especially with automated transaction stations and telephone systems); soon after, everything from university registration to grocery stores followed suit. With the boom of the World Wide Web, however, it seems every arena of human activity (stock brokers and travel agents have been particularly hard hit) is being rapidly disintermediated. As Ullman points out, the underlying ethos is that 'ordering sensibilities' are no longer necessary, and that anyone can be an expert and go it alone in the wide-open information frontier. Very little thought seems to be given in this process to people who might need assistance or guidance, which is ultimately where these new decentralized systems may fail many users. Once again, information exchange systems are routinely designed for, and imprinted by, those already savvy enough to navigate them, leaving many others to fend for themselves as the human side of these services is cut back or eliminated.
Notes:
18 G. Blanc, "Beware of the Information Age," Development I (1985): 78. This process is also called 'datafication' or 'quantization', depending on which branch of the research literature is consulted. The late composer Igor Stravinsky even went so far as to call the process the "statisticalization of the mind": the reduction of the entire mental world to quantity and the debasement of all relationships between people and things to mere numbers. On another track, Vincent Mosco has labeled ours the 'pay-per society', where all appealing forms of information and entertainment are distributed on a per-use basis and much of the information formerly available freely in libraries must now be accessed exclusively through expensive databases. See Mosco, "Information in the Pay-per Society," The Political Economy of Information (Madison: 1988), 3-26.

19 In the early 1980s, in the aiding and abetting of datafication, there was a great deal of deliberation over 'expert systems'. Again, this was a rather vaguely defined bit of vapourware which management theorists discussed at great length as a way to foster organizational learning, but which in the last twenty years seems to have dropped from sight: "Experts who have participated in the creation of expert systems commonly report that the process of articulating their knowledge in order to represent it on computers has itself yielded a better body of knowledge and a more complete understanding of what they know. Reflecting on this experience, Donald Michie has proposed the creation of knowledge refineries where such processes could be used routinely to purify and formalize crude knowledge." See Mark Stefik, The Internet Edge: Social, Legal and Technological Challenges for a Networked World (Cambridge: MIT, 1999), 156, and D. Michie, "A prototype knowledge refinery," in Intelligent Systems: The Unprecedented Opportunity (Chichester: Ellis Horwood, 1984).

20 Paul E. Ceruzzi, A History of Modern Computing (Cambridge: MIT Press, 1999), 303.

21 Negroponte is neither the first nor the last to argue along these lines. This transnational capitalism thesis has been expounded at length by dozens of authors, with varying degrees of colorful dystopianism tossed in for good measure. Robert Reich, US Secretary of Labor from 1993-97, steadfastly predicts the fall of the national unit, the Balkanization of regions and the rebirth of powerful city-states governed by 'core elites' of 'symbolic analysts'. Others, like Peter Hall and Paschal Preston, pick up on the disappearance of the middle class, predicting a lower class 'conditioned to accept high levels of workplace monotony as the population that operates computer information systems is deskilled' (they add that video games are crucial to this conditioning), contrasted against a mobile elite class with few binding ties to region or ethnicity (and subsequently little empathy or sense of responsibility for the world around them). See Nicholas Negroponte, Being Digital (New York: Knopf, 1995), 227; Robert Reich, The Work of Nations: Preparing Ourselves for 21st-Century Capitalism (New York: Vintage, 1992); or Hall & Preston, The Carrier Wave: New Information Technology and the Geography of Innovation (1988).

22 On August 8, 1995, Netscape offered shares to the public at $28 per share. By the close of the same day the shares were valued at $58, and within three months they had risen to $150 per share; the company itself was eventually bought by America On-Line. See Paul E. Ceruzzi, A History of Modern Computing (Cambridge: MIT Press, 1999), 303.

23 Ellen Ullman, Close to the Machine (San Francisco: City Lights, 1997), 78.
