Really early computational devices

The earliest computational devices were actually memory aids: wet clay tablets, or pebbles organized into piles, which were used to perform and record calculations. The most famous such device is undoubtedly the abacus, which was developed independently by the Chinese and the Romans.

The concept of an algorithm

The Elements, the work of the 4th century B.C. Greek mathematician Euclid, contains the earliest recorded description of specific algorithms. This work also contains the earliest known attempt to formally define what an algorithm is.
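
The best-known algorithm in The Elements is the procedure for finding the greatest common divisor of two numbers (Book VII). In modern form it is only a few lines; the Python rendering below is mine, the method is Euclid's:

```python
def euclid_gcd(a: int, b: int) -> int:
    """Greatest common divisor by Euclid's method: repeatedly replace
    the pair (a, b) with (b, a mod b) until the remainder is zero."""
    while b != 0:
        a, b = b, a % b
    return a

print(euclid_gcd(1071, 462))  # 21
```

Two and a half millennia later, the same procedure still sits in every standard library.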

The word algorithm

Abu Ja'far Muhammad ibn Musa al-Khwarizmi of Baghdad was the early 9th century author of a variety of quite important mathematical texts. His name, al-Khwarizmi, (say it quickly) is generally considered to be the source of the word algorithm.

The invention of binary encoding

The Inca civilization (founded in about 1200 AD, destroyed by the Spanish in 1532) almost certainly used a form of binary encoding in their khipus. A khipu has a primary cord to which secondary cords are attached. Tertiary cords are sometimes attached to the secondary cords. The cords are knotted either individually or together (i.e. forming a weave-like pattern).

Although khipu were and are decorative objects, they also stored information. There is a seven-layer decision-making process which the maker of the khipu follows when creating it (whether to use cotton or wool, whether to use "spin" or "ply" cords, whether to hang the secondary cord from the front or the back of the primary cord, etc.). The first six decision points each have two possible values or results. The seventh decision point, colour, has twenty-four possible values. The result is a system capable of hierarchically encoding 1536 different values (2^6 * 24 = 64 * 24 = 1536).
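
The capacity claim above is simple arithmetic: six two-way choices give 2^6 combinations, each of which can be paired with any of the twenty-four colours.

```python
# Capacity of the khipu maker's encoding scheme described above:
# six binary decision points, then one of twenty-four colours.
binary_combinations = 2 ** 6      # 64 ways to resolve the six choices
colour_values = 24                # the seventh decision point
total_values = binary_combinations * colour_values
print(total_values)  # 1536
```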

Pascal's calculator

Blaise Pascal invented a mechanical calculator capable of adding and subtracting numbers in 1642. Pascal was awarded a royal monopoly by the King of France in 1645 to produce his Pascaline calculator. The extreme difficulty of producing the many small components of each calculator meant that it was never a commercial success. Only eight of the roughly fifty Pascaline calculators built by Pascal still exist today.

One interesting point is that Pascal's method of performing subtraction by adding the complement of the subtrahend to the minuend is (almost?) identical to how subtraction is performed by modern computers although Pascal's device operated in base 10 (decimal) whereas modern computers operate in base 2 (binary).
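
The complement trick is easy to demonstrate. The sketch below is mine and uses the radix complement in a fixed number of digits (the Pascaline worked with nine's complements and a modern ALU with two's complement, but the principle, subtraction performed as addition, is the same):

```python
def subtract_by_complement(minuend: int, subtrahend: int,
                           base: int, digits: int) -> int:
    """Subtract by adding the radix complement of the subtrahend and
    discarding the carry out of the most significant digit."""
    modulus = base ** digits
    complement = modulus - subtrahend        # the radix complement
    return (minuend + complement) % modulus  # adding it == subtracting

# Pascal's scheme, in base 10 with four-digit "wheels":
print(subtract_by_complement(503, 87, base=10, digits=4))  # 416
# The same trick in base 2, as a modern 8-bit ALU performs it:
print(subtract_by_complement(100, 40, base=2, digits=8))   # 60
```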

The Leibniz calculating machine

Gottfried Wilhelm Leibniz, a lawyer by trade, invented an improved calculating machine in 1673. Leibniz's machine was capable of mechanically calculating the four primitive arithmetic operations of addition, subtraction, multiplication and division.

Vaucanson's automated creations

Jacques de Vaucanson was a French inventor of automated devices. His most famous creation was an automated duck which could flap its wings, "eat" fish, "digest" them and even "defecate" the result!

Vaucanson was appointed the inspector of silk manufacturing in 1741. In addition to introducing many significant process improvements into the industry, he developed an automated loom which was at least partially controlled by punched cards. Apparently, fear of the impact that the loom might have on the industry led to its rejection by weavers.

Although not strictly speaking "computational devices", his creations represented a key stage in the development of machines which could be controlled "programmatically".

The Jacquard loom

Joseph-Marie Jacquard invented an automated loom controlled by a series of punched cards (i.e. a stored program) in France in 1804. The invention was at least partially based on Vaucanson's loom. Jacquard's loom was NOT very popular with textile workers as they believed that it threatened their livelihood. It was a technical success in that it allowed the creation of woven cloth with essentially arbitrarily complex patterns.

The Difference Engine

By the early 1800s, a variety of books of mathematical tables had been published. These books of logarithmic, trigonometric and other mathematical tables were used by navigators, engineers, artillerymen, insurers, bankers and other professionals. The problem was that the tables contained mistakes introduced by the process of manually calculating the hundreds or even thousands of individual values (i.e. numbers) in each table.

In 1822, Charles Babbage proposed building a calculating machine which could be used to mechanically produce "correct" tables. The machine was designed to be capable of producing tables accurate to 18 digits (with the last digit properly rounded) and could be configured to perform certain calculations to 30 decimal digits. Funding was provided by the British government and construction of Babbage's Difference Engine began in earnest. The project was immediately challenged by the inability of then-current manufacturing processes to produce the mechanical components with a sufficient degree of precision. By about 1827, the project had stalled completely due to a lack of funding and Babbage's inability to accept anything less than perfection.
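
The engine's name comes from the technique it mechanized: the method of finite differences, by which a table of polynomial values can be extended using nothing but repeated addition. A minimal sketch of the idea (function and variable names are mine):

```python
def difference_table(seed_values, count):
    """Extend a table of polynomial values by the method of finite
    differences: once enough initial values are known, every further
    entry needs only additions -- which is what Babbage's engine did."""
    # Build the columns of successive differences from the seed values.
    diffs = [list(seed_values)]
    while len(diffs[-1]) > 1:
        col = diffs[-1]
        diffs.append([b - a for a, b in zip(col, col[1:])])
    # For a degree-n polynomial the last column is constant, so the
    # table can be grown indefinitely by cascading additions upward.
    lasts = [col[-1] for col in diffs]
    table = list(seed_values)
    for _ in range(count):
        for i in range(len(lasts) - 2, -1, -1):
            lasts[i] += lasts[i + 1]
        table.append(lasts[0])
    return table

# x^2 for x = 0..6, grown from just the first three values:
print(difference_table([0, 1, 4], 4))  # [0, 1, 4, 9, 16, 25, 36]
```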

Babbage produced detailed plans for a second Difference Engine (Difference Engine 2) although only a handful of components were ever built.

The Analytical Engine and the invention of programming

Babbage effectively abandoned the Difference Engine project in the early 1830s when he conceived of what can only be described as a true computer. The Analytical Engine would have memory, a processing unit (called a "mill") and an input/output mechanism. Probably inspired in part by the Jacquard loom, the Analytical Engine would be controlled by a series of punched cards.

Ada Byron, the daughter of Lord Byron, attended a dinner party in 1834 at which Charles Babbage described his Analytical Engine. Babbage's suggestion that 'a calculating machine should be able to not only foresee but act on that foresight' caught Ada's attention.

Babbage gave a seminar in Turin, Italy in 1840 where he described his Analytical Engine idea. His description was written up in Italian by Luigi F. Menabrea. Ada, now Lady Lovelace, translated the article into English and presented a copy of the translation to Babbage. His suggestion that she write notes to accompany the article led to an extended correspondence between the two. They conceived of the idea of creating a formal description of what an Analytical Engine should compute, and Ada Lovelace became the world's first computer programmer when she wrote programs for the as-yet unbuilt Analytical Engine.

Ada Lovelace's death by cancer in 1852 at the age of 36 effectively ended the Analytical Engine project although Babbage continued to work on the idea sporadically until his death in 1871.

Hollerith's tabulating machine

Processing the data from the 1890 U.S. census was going to be a problem. In fact, it was generally recognized that there wasn't any way to process and tabulate the data from the 1890 census fast enough to have the results ready before the 1900 census (i.e. the results would be worthless before they even existed).

Fortunately, Herman Hollerith had a solution. In 1888, Hollerith had invented a tabulating machine which used punched cards quite similar to those used in Jacquard's loom. A key difference, from the perspective of the history of computing, was that Hollerith's machine treated the punched cards as data whereas Jacquard's loom had treated them as instructions. By encoding the information on each person onto a punched card, Hollerith's tabulating machine could be and was used to produce tabulated 1890 Census data in just six months.

Hollerith was to lend his name to a form of character constants called Hollerith literals in early versions of FORTRAN. His most lasting contribution is almost certainly the company that he formed, the Tabulating Machine Company, which through a series of mergers became IBM.

De Forest's triode vacuum tubes

Earlier work by Thomas Edison and John Fleming led to Lee de Forest's invention of the triode vacuum tube in 1906. The presence or absence of power on one input could be used to determine if a signal on a second input was passed through to the output. This function, absolutely critical to digital logic, is typically performed by a transistor in modern computers.

Strictly speaking, the level of power on one input determined how much of the power on the second input was passed through to the output. This analog behaviour was the contemporary reason for the triode vacuum tube's success, as it made it possible to construct much more efficient analog amplifiers.
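
Treating the tube as a pure on/off switch (and ignoring the analog behaviour just described), the digital function is easy to model. This idealized sketch is mine; it also shows how two such switches in series behave as a logical AND, the kind of building block digital logic is made from:

```python
def triode_as_switch(control_powered: bool, signal: int) -> int:
    """Idealized digital behaviour of the triode described above:
    the control input gates whether the signal reaches the output."""
    return signal if control_powered else 0

def and_gate(a: bool, b: bool) -> int:
    # Two gated switches in series: the signal gets through
    # only when both controls are energized.
    return triode_as_switch(b, triode_as_switch(a, 1))

print([and_gate(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
```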

The first digital computers

There's still a certain amount of controversy and confusion surrounding who built the first fully functional electronic digital computer. Two clear contenders are:
  • Konrad Zuse of Germany built what was almost certainly the world's first digital computer, the Z1, between 1936 and 1938. The Z1 was a completely mechanical device (i.e. no electronics). Zuse had been developing the notion of what a computer should be for some time: he certainly understood by 1934 that a computer would require memory, an arithmetic unit and a control unit, and he filed a patent in Germany in 1936 describing such a machine. The patent also describes how instructions in the form of combinations of bits could be stored in the memory of the machine.

    The Z1 is also significant in that it was almost certainly the first device to perform floating point arithmetic.

    Zuse completed the hybrid mechanical/electrical Z2 in 1940. In 1941 he completed the Z3, a relay-based successor to the Z1 and arguably the world's first fully functional programmable digital computer (although, being built from electromechanical relays rather than vacuum tubes, it was not strictly an electronic device).

  • John V. Atanasoff and Clifford Berry of Iowa State University built the Atanasoff-Berry Computer between 1937 and 1942. This computer is also arguably the world's first electronic digital computer (i.e. it depended primarily on electronics as opposed to mechanical devices).

Turing's notion of computability

Alan Turing published On Computable Numbers, with an Application to the Entscheidungsproblem in 1936. This landmark paper described an imaginary computational machine which Turing called an LCM (Logical Computing Machine). Turing used the LCM to define what it means for something to be "computable". In short, if a problem can be solved using an LCM then it is computable, and if it can't be solved by an LCM then it isn't computable. He went on to prove that there are some problems which are not "computable" (see Halting Problem).

LCMs are today called Turing Machines in honour of their inventor.
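
A machine of this kind is straightforward to simulate. The sketch below is illustrative only: the rule-table format, state names and tape alphabet are my own conventions, not Turing's.

```python
def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Run a one-tape Turing machine.  `rules` maps (state, symbol) to
    (new_symbol, move, new_state), where move is -1 (left) or +1 (right).
    The machine halts when no rule applies; the final tape is returned."""
    cells = dict(enumerate(tape))  # sparse tape, "_" is the blank symbol
    for _ in range(max_steps):
        symbol = cells.get(head, "_")
        if (state, symbol) not in rules:
            break  # no applicable rule: halt
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells))

# A tiny machine that flips every bit on the tape, then halts
# when it reaches the first blank cell.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run_turing_machine(flip, "10110"))  # 01001
```

Despite its simplicity, this model captures everything a modern computer can compute, which is exactly why Turing chose it.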

Computing in World War II

The Second World War triggered an explosion in the development of computing. The contributions to the field during the war were, quite literally, too many to even hope to list. Here are a few of the more significant ones:
  • Alan Turing made major strides in the development of computing while cracking the German codes (see Enigma). This included practical developments like the Bombes used to mechanically attack the Enigma codes.

  • Konrad Zuse continued to develop his computers for Germany, building the (hybrid mechanical/electrical) Z2 (1940), the (relay-based) Z3 (1938-41) and the (relay-based) Z4 (1941-45).

  • Vannevar Bush developed a machine called the "Differential Analyzer". Weighing in at one hundred tons (90,000 kilograms), this analog computer operated by implementing "analogs" to actual physical processes.

  • Howard Aiken and Grace Hopper were the designers of the Harvard Mark I which became operational in 1944. The discovery of a dead moth which had caused the Mark I's successor, the Mark II, to fail one day in 1947 led to Grace Hopper and her team popularizing the term bug.

  • Although not completed until shortly after the war, the development of ENIAC by John Presper Eckert and John William Mauchly was a direct result of the wartime need for computational power, primarily although not entirely in the area of ballistics.

  • Computational devices like the Norden Bombsight played a major role in the war although, with the advent of digital computers, I'm going to have to ignore these purely mechanical devices or this w/u will never get finished.

Sidebar: The post-war explosion of computing

With the end of the Second World War, computing technology had reached a level of maturity that was to provide the foundation for a veritable explosion of technology. The next twenty years would see the development of practically all of the major concepts and technologies which make up a modern computer including:
  • the transistor
  • operating systems
  • compilers
  • virtual memory
  • microcode
The remainder of this w/u will focus on a handful of the most major milestones. Feel free to suggest which of the many omitted milestones should also be covered.

The invention of hypertext

Vannevar Bush makes another appearance in this w/u with his 1945 Atlantic Monthly article titled As We May Think. This article discusses how computers might be used to assist humans in the processing of information. It describes a device that Bush calls a Memex which organizes information using the hypertext mechanisms which are familiar to anyone who uses the World Wide Web today.

Hypertext's invention in 1945 is arguably not a "major milestone" as hypertext would have to wait for the creation of the World Wide Web almost fifty years later to become in any sense relevant. It's included in this writeup more as an example of how old some of our "new" ideas really are. It's also a rather striking example of how a truly fundamental idea is pretty much irrelevant until the technology required to implement it actually exists.

The transistor

One of the most important inventions of the 20th century was made in 1947 by William Shockley, Walter Brattain and John Bardeen of Bell Labs. Their transistor, actually two different kinds of transistors, was the result of an effort launched shortly after the end of the war to replace the vacuum tube triode and similar devices with some sort of solid device. Like the triode, a transistor can be used as a digital switch or as an analog amplifier.

The UNIVAC computer

With the end of the war, the U.S. Census Bureau (remember them?) had another incarnation of the same problem of sixty years earlier - there was no way to process the data which would be collected in the upcoming 1950 census in a reasonable timeframe. The Census Bureau turned to the inventors of ENIAC, John Presper Eckert and John William Mauchly, with a contract to build a suitable computer for not more than $400,000US. Eventually rescued financially by Remington Rand Inc. in 1950, the first UNIVAC was delivered to the U.S. Census Bureau in 1951. Forty six UNIVACs were eventually built.

UNIVAC was the first even remotely general purpose commercial computer. The original UNIVAC is in the Smithsonian Institution.

The IBM System/360

After a massive investment of five billion dollars (i.e. 5,000,000,000 1964 dollars) on a literally "bet the company" venture, IBM held a news conference on April 7, 1964 to introduce their new System/360 product line. Suddenly, computers were no longer massive "personal computers" which could run only one program at a time but were now true "business machines" which could be used to run multiple programs or jobs simultaneously. The System/360 was also a real "system" in the sense that it had a well defined system architecture along with a family of products built around the architecture.

The era of the mainframe had arrived!

After premature warnings of the "death of the mainframe" in the early 1990s, IBM's mainframe business has recovered and today continues to generate significant revenue ($US4.2 billion in 2003, an increase of 6% over 2002).

Factoid: As of 2004, roughly 70% of the world's data is still stored on mainframes.

The development of the early ARPANET

It was 1966 and Bob Taylor had a problem and an opportunity. The problem was that he and his people were using a number of different computer systems and it wasn't possible to easily share information between the systems. The opportunity was that Bob Taylor was the newly appointed director of ARPA's IPTO (Information Processing Techniques Office) and was in a position to do something about the problem by putting his influence behind an idea that he'd kicked around with the previous IPTO director Joseph Licklider. The idea was to connect the key computers together using some sort of a digital link or "network".

As the idea developed, it became clear that probably the key piece of as-yet non-existent technology was a communications device which was soon called an Interface Message Processor (IMP). The RFPs (requests for proposals) went out in mid-1968. By the deadline date, over a dozen replies had been received including one from IBM and another from CDC (Control Data Corporation). Although negotiations with Raytheon began in early December, the winning bidder was ultimately BBN (Bolt Beranek and Newman).

The process of actually developing the first IMPs and getting them operational was far from smooth (see Where Wizards Stay Up Late / The Origins of the INTERNET in the "Sources" below for details). The first IMP was installed at UCLA in September, 1969 and the second was installed at SRI in October. By the end of the year, UCSB had IMP number three and the fourth IMP was in Utah. It took another year to architect and implement a communications protocol called the NCP (Network Control Protocol).

The ARPANET, precursor of the INTERNET, was alive.

The dawn of the personal computing era

The MITS Altair 8800 appeared on the January, 1975 cover of Popular Electronics. The computer was available as a kit for $395 USD and assembled for $495. It had 256 bytes of memory and a 2MHz Intel 8080. The only input device was a set of switches on the front panel, and a set of lights on the same panel was the only output device.

The IMSAI 8080 was announced in mid-1975. With a price tag of about $250 USD, it had 4K of memory, an Intel 8080A and a twenty two (22) slot S-100 bus (along with lights and switches on the front panel).

Together with other hobbyist kits and the like, these computers launched the era of personal computing.

Birth of the World Wide Web

Building on the work of Vannevar Bush (hypertext) and others, Tim Berners-Lee launched the World Wide Web in 1990. The first web browser ran on the NeXT system and the first web site was targeted at the High Energy Physics community. It would take a couple of years but by the mid-1990s, the World Wide Web had brought the INTERNET to the masses.

The rest, as they say, is history.

What about ___________?

While it's true that no "history of computing" would be complete without a discussion of Apple, Microsoft, Sun, Cray, Xerox, Unix, MS Windows, MacOS, OS/VS1, etc., it's also true that this is a BRIEF history of computing. Decisions had to be made. Feel free to suggest changes or, probably better yet, write your own w/u under this node.

P.S. Personally, I suspect that there are far more important holes in my coverage of early computing history than in my coverage of modern computing history. Maybe I should require that each suggestion for an addition to the modern computing portion be accompanied by a suggestion for an addition to the early computing portion! (grin)


Sources

  • The web page titled located at (last accessed 2003/06/03)
  • The June 23, 2003 article in The Independent newspaper titled Inca may have used knot computer code to bind empire by Steve Connor (Science Editor). Located on the 'net at (last accessed 2003/09/23)
  • The web page titled on Jacques de Vaucanson and his Duck located at (last accessed 2003/06/03)
  • The web page titled Vaucanson's Duck located at (last accessed 2003/06/03)
  • The web page titled A short Biography of Leibniz located at (last accessed 2003/06/03)
  • The web page titled The pattern loom located at (last accessed 2003/06/03)
  • The web pages titled Mechanical Aids to Computation and the Development of Algorithms, the first of which is located at (last accessed 2003/06/03)
  • The web page titled Ada Byron, Lady Lovelace (1815-1852) located at (last accessed 2003/06/03)
  • The web page titled Introduction to Ada Lovelace's Translation of, and Notes to, Luigi F. Menabrea's "Sketch of the analytical engine invented by Charles Babbage, Esq." (1842/1843) by Christopher D. Green located at (last accessed 2003/06/03)
  • The web page titled Babbage's Difference Engine located at (last accessed 2003/06/03)
  • The web page titled Herman Hollerith's Tabulating Machine located at (last accessed 2003/06/03)
  • The web page titled Inventor Herman Hollerith located at (last accessed 2003/06/03)
  • The web page titled The Invention of the Vacuum Tube located at (last accessed 2003/06/03)
  • The web page titled History in the Computing Curriculum located at (last accessed 2003/06/03)
  • The web page titled Konrad Zuse located at (last accessed 2003/06/03)
  • The web page titled The Life and Work of Konrad Zuse by Horst Zuse, located at (last accessed 2003/06/03)
  • The PDF file titled Z1, Z2, Z3 and Z4 located at (last accessed 2003/06/09)
  • The web page titled Reconstruction of the Atanasoff-Berry Computer located at (last accessed 2003/06/03)
  • The web page titled John Vincent Atanasoff and the Birth of the Digital Computer located at (last accessed 2003/06/05)
  • The PDF file titled Subject-Matter Imperialism? Biodiversity, Foreign Prior Art and the Neem Patent Controversy located at (last accessed 2003/06/05)
  • The web page titled Computable Numbers, 1936 and the Turing Machine located at (last accessed 2003/06/03)
  • The web page titled Vannevar Bush's Differential Analyzer located at (last accessed 2003/06/03)
  • The web page titled The ENIAC Story located at (last accessed 2003/06/03)
  • The 1945 Atlantic Monthly article titled As We May Think by Vannevar Bush. This article is quite easy to find on the 'net. One copy is located at (last accessed 2003/06/03)
  • The web page titled Transistorized! located at (last accessed 2003/06/03)
  • A series of web pages titled The History of Computers by Mary Bellis, located at (last accessed 2003/06/03)
  • The web page titled IBM's 'dinosaur' turns 40 PCs were supposed to kill off the mainframe, but Big Blue's big boxes are still crunching numbers located at (last accessed 2004/04/05)
  • The book Where Wizards Stay Up Late / The Origins of the INTERNET by Katie Hafner and Matthew Lyon; published by Simon and Schuster; Copyright © 1996 by Katie Hafner and Matthew Lyon; ISBN 0-684-81201-0.
  • The web page titled The MITS Altair 8800 located at (last accessed 2003/06/09)
  • The web page titled The IMSAI 8080 located at (last accessed 2003/06/09)
  • The web page titled The World Wide Web: a very short personal history by Tim Berners-Lee, located at (last accessed 2003/06/09)
  • Personal knowledge.
