
"Truth is much too complicated to allow anything but approximation."
- John von Neumann, mathematician, Institute for Advanced Study, Princeton (1947).
"If we use, to achieve our purposes, a mechanical agency with whose operation we cannot efficiently interfere once we have started it, because the action is so fast and irrevocable that we have not the data to intervene before the action is complete, then we had better be quite sure that the purpose put into the machine is the purpose which we really desire and not merely a colorful imitation of it."
- Norbert Wiener, founder of cybernetics, MIT (1960).
§ 1. Information Theories

One of the first major theoretical developments in computing after the war was outlined in a short paper published in 1948 by a former graduate student of Vannevar Bush's. Claude Shannon had taken a position at one of Bell Telephone's research labs, and his essay "A Mathematical Theory of Communication" broke new ground in the field of information studies by establishing the 'bit' as the elementary particle of information flow. This deft re-articulation of what information is, defined along binary lines, gave communication itself a fixed unit of measure, allowing terms like noise, distortion and error to be viewed in a different light.

The same year Shannon published his theory of information, Bell Telephone Laboratories was beginning to replace a great deal of its vacuum tube technology (expensive, fragile and energy-inefficient) with a new piece of hardware its own researchers had invented only months earlier: the 'transistor', first demonstrated at Bell Labs in December 1947. Shannon's research seems to have been done in the hope of optimizing the advantages of this new technology, in conjunction with developments underway in other areas of computing. Shannon's paper was later paired with an expository essay by Warren Weaver, of the Rockefeller Foundation, underscoring one of the finer points of information theory: that "the semantic aspects of communication are irrelevant to the engineering aspects." 1

This conception of Shannon's (which no longer seems terribly shocking) was quite a break with the past. The work of Charles Babbage had sought to mechanize mathematical computation, and the telegraph was seen as transmitting information and intelligence via electricity. In the US and around the world, business machines such as Hollerith's tabulator or the Burroughs Adding Machine line 2 had been well-established for decades, and had sought to accurately manipulate or process numerical data. Shannon's theory, however, claimed to offer quantifiable measures of any type of communication, regardless of form (though Bell was obviously interested at that time in sound) and, more significantly, irrespective of content. 3

This re-definition brought information into a strictly technical and delineated area for the first time, insofar as it a) established a set unit of measure, the bit, from which others could be derived as needed (the byte, the kilobyte, the megabyte, etc.), b) allowed other formulas and standards to be built on that quantification (error frequency and correction, signal-to-noise ratio, etc.) and c) moved information and communications squarely into the practical realm of 'engineering' or 'science' once and for all, by relinquishing any 'fuzzy' notions of forms of knowledge, methods of communication, shades of meaning or questions of content. 4
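Shannon's unit of measure can be illustrated with a short calculation (a sketch in modern Python, not drawn from the original text): the information content of a source with symbol probabilities p_i is H = -Σ p_i log2(p_i) bits per symbol, which is how the bit lets unlike sources be compared on one scale.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per toss;
# a heavily biased source carries less information per symbol.
print(entropy_bits([0.5, 0.5]))  # 1.0
print(entropy_bits([0.9, 0.1]))  # ~0.469
```

The second figure is the sense in which a predictable channel "says" less, regardless of what its symbols mean.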

At the time, Shannon was only extending the process of abstraction which had been applied to information first in the office environment, and then again in cryptanalysis during the war, where reams of statistical data could only be managed if first subjected to some standardization (typewriters and standard paper forms had been a first step, soon followed by punch-card sorting systems). What the computing work of the war had shown (at least to those paying attention) was not simply that machines could process numbers to produce a better ballistics table; that application was at least a century old even in the 1940s. Rather, the new insight was the machine's ability to manipulate language (as if it were numerical), read numbers (as if they were a language), and then act on both through a set of coded instructions. Instructions to the computer's processing unit were unrestricted; any set of equations or functions could be manipulated or reworked endlessly, and so the idea of a 'universal computing machine' emerged, with almost boundless areas of possibility. 5 However, in order to exploit the machine's versatility, concessions had to be made to its limited machine logic; complicated applications were possible only in areas where the information to be processed could be properly 'converted' to 1s and 0s.
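The 'conversion' described above can be sketched in a few lines (a modern Python illustration, with ASCII standing in for whatever character code a given machine actually used): text becomes patterns of 1s and 0s that the machine can manipulate as numbers, and those patterns can be read back as text.

```python
def text_to_bits(text):
    """Encode each character as an 8-bit binary string (ASCII byte values)."""
    return [format(b, "08b") for b in text.encode("ascii")]

def bits_to_text(bits):
    """Recover the original characters from the bit strings."""
    return bytes(int(b, 2) for b in bits).decode("ascii")

bits = text_to_bits("Hi")
print(bits)                # ['01001000', '01101001']
print(bits_to_text(bits))  # Hi
```

Once language sits in this form, the same arithmetic circuits that compute ballistics tables can sort, compare and transform it, which is the versatility the paragraph describes.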

This struck many people in the swelling field of electronics as deeply interesting, but the questions of abstraction, programming and computing did not really grab the attention of anyone outside academic or scientific circles until 1952. In that year Grace Murray Hopper, working with Mauchly and Eckert on the UNIVAC (Universal Automatic Computer), became a pioneer in computing when she developed the first compiler software while building a statistical analysis program. 6 The Remington-Rand Corporation then used the presidential election of '52 to unveil, on live television, the possibilities available with the aid of a well-programmed UNIVAC unit. During an election-night news program the computer, extrapolating from the early returns of a few states, successfully predicted who would win the election. Most political pundits were vocally unsure of Dwight Eisenhower's chances and felt it would be a close race, but UNIVAC computed otherwise, predicting a landslide which commentators called absurd. Eisenhower carried the day by a wide margin, securing 442 of 531 electoral college votes.

The next traumatic technological jolt for the American populace as a whole (and US scientists and politicians in particular) came on the evening of Oct. 4, 1957, when international news agencies confirmed the Soviet Union had successfully launched, by way of a complicated rocket system, an orbiting satellite called Sputnik. Confidence in the scientific progress of the democratic West was seriously shaken, even after the Explorer satellite was launched on Jan. 31, 1958, less than four months later. No doubt there was still widespread discomfort and fear (especially in a fully Atomic Age) at the idea of 'falling behind' the Russians. In 1958, the US Congress (tending, even after the fall of Senator Joseph McCarthy, to read the worst into Communism's progress) approved $2 billion in annual funding for ARPA (the Advanced Research Projects Agency) to fund general scientific research, and NASA (the National Aeronautics and Space Administration) was formed specifically to pursue aeronautical engineering and space exploration. 7
Notes:

1 The original paper was later expanded into a full book. See Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1964), 8.

2 The Burroughs Business Machine Company, the Hollerith tabulators of IBM and Remington-Rand Office Machines were all direct information-technology competitors at this time. Remington had long been established as a manufacturer of munitions and rifles, while Hollerith had established his company (later merged into the Computing-Tabulating-Recording Co., forerunner of IBM) after developing a mechanized counting machine on behalf of the US Census Bureau. Burroughs was a relatively new player in the domain (Burroughs's grandson, incidentally, was the notoriously drug-addled Beat writer William S. Burroughs, who tended to write hallucinogenic novels in which machines took on an insectoid intelligence of their own). See Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (NY: BasicBooks, 1996), 34-39.

3 In 1929, Leo Szilard, a Hungarian contemporary of John von Neumann, was actually the first to suggest that information might have statistical qualities. He did so in a paper which successfully attacked the paradox of 'Maxwell's Demon', proposed in 1871 by James Clerk Maxwell, which posited a possible contradiction to the second law of thermodynamics (entropy). See William Aspray, "Origins of Von Neumann's Theory of Automata," in The Legacy of John von Neumann: Proceedings of Symposia in Pure Mathematics (Providence: American Mathematical Society, 1990), v. 50, 191.

4 As the philosopher of technology Albert Borgmann points out, "when Claude Shannon wrote his seminal article on information theory, he was concerned to keep the problem he had set himself crisp and clear. So he restricted himself to the structure of signs and explicitly disregarded the questions of what those signs might be about." Critics of information technology and its enthusiasts tend to view Shannon's information theory as a harbinger of the on-going 'datafication' of everything. Theodore Roszak's The Cult of Information (1986) makes some excellent cases for where information cannot be quantified or transmitted without some loss of quality, and Borgmann makes similar claims in Holding On to Reality: The Nature of Information at the Turn of the Millennium (Chicago: University of Chicago Press, 1999). These are valid critiques, surely, for there are certainly endless examples of experiences, sensations and thoughts which defy digitization. However, both writers seem to condemn Shannon for assumptions about information he in fact never made, especially given that his original work was largely occupied with signal transmission and stability, not computing per se.

5 The notion of the Universal Turing Machine, a sort of Swiss Army knife of all-purpose computation, goes back to years before the war. However, it's important to keep in mind that it was only during the 1950s that the computer really began to move out of its governmental cradle. As William Sharpe explains, "until the fifties, the computer industry was essentially non-commercial. Each machine was one of a kind and support came primarily from universities and government. In fact, it can be plausibly argued that without government (and particularly military) backing, there might be no computer industry today." The military aspect is also emphasized by Alfonso Molina when he writes, "the complex of capital-government-military-scientific interests converged during and after W.W.II to become the dominant social constituency behind the development of microtechnology, particularly in the US where the pressures of the Cold War were, of course, more acute." See W. Sharpe, The Economics of Computers (New York: Columbia University Press, 1987) or A. Molina, The Social Basis of the Microelectronics Revolution (Edinburgh: University Press, 1989).

6 "Ever since the early days of computers, the task of connecting the micro and macro informational worlds (of experience and technology) has fallen steadily to special programming languages...encoding reality into digital strings...out of these formative efforts emerged assembly languages...utilizing systematic, mnemonic codes that eased conversion of problems (experience/reality) into digitized form...S(n)C might mean 'store in the Nth register of memory an object of informational content 'C'...the S, n and C would then be represented by a specific pattern of 1s and 0s...a little later, with greater standardization, came 'high-level' compiler languages, like FORTRAN and COBOL...used for direct processing of scientific and commercial data." See Michael Hobart and Zachary Schiffman, Information Ages: Literacy, Numeracy, and the Computer Revolution (Baltimore: Johns Hopkins University Press, 1998), 239-240.
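The S(n)C encoding Hobart describes can be sketched with a toy assembler (a purely hypothetical illustration in Python — the opcode value and field widths here are invented, not drawn from any historical machine): the mnemonic, register number and content value are each packed into a fixed field of 1s and 0s.

```python
# Hypothetical opcode table: 'S' = store, encoded as the 2-bit pattern 01.
OPCODES = {"S": 0b01}

def assemble(mnemonic, n, c):
    """Pack opcode (2 bits), register n (4 bits), content C (8 bits)
    into one 14-bit machine word, returned as a string of 1s and 0s."""
    word = (OPCODES[mnemonic] << 12) | (n << 8) | c
    return format(word, "014b")

# S(3)42: store the value 42 in register 3.
print(assemble("S", 3, 42))  # '01001100101010'
```

A real assembler of the era did essentially this translation (plus address bookkeeping) so that programmers could write mnemonics rather than raw bit patterns.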

7 Significantly, Roy Johnson, a vice-president of General Electric, was selected as ARPA's first director, which formalized the relationship between the scientific, commercial and military research communities in the US. General Electric has been one of the primary defense contractors for the US Department of Defense ever since. In the 1998 fiscal year, for example, GE and its subsidiaries received contracts in the area of $1.9 billion (though aerospace and missile-systems manufacturers, it should be said, make up a far greater share of defense spending, e.g. Lockheed Martin at $12.3 billion or Boeing at $10.8 billion that same year). See "National Defense - Contracts," World Almanac and Book of Facts 2000 (Mahwah, NJ: Primedia, 2000), 213.

