Computer Science for Smart People

the first block

As opposed to the "for dummies" concept, I will try to make things more difficult by pointing out connections and making you see the trade-offs, the political reasons, the ugly causes and the confusion about the future.
It is not cut and dried. If it were, I would not be doing it.
In case you are wondering, I am writing this as base material for a course in Computer Science that I am teaching to non-CS students (mostly coming from a design background).

Lesson 1

Historical Roots

It is difficult to know how far back one should go in history. Since I have no sense of measure, let us start with Aristotle. We are indebted to him for many things and concepts, but the one I am interested in right now is the syllogism (or, more generally, logic).
Even if Aristotle probably would not have seen it that way, the cool bit about the syllogism is that it works like a little truth-spewing machine. If you feed it correctly, with just the right form of statement, it will spit out a truth for you.
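
To make the "little machine" image concrete, here is a minimal sketch in Python (obviously not something Aristotle had in mind; the sets and names are made up for illustration). The classic syllogism in "Barbara" form - all men are mortal, Socrates is a man, therefore Socrates is mortal - reduces to a subset test plus a membership test:

  # Syllogism in "Barbara" form as a tiny truth-spewing machine:
  # if every member of m is in p, and the individual is in m,
  # then the individual must be in p.

  def barbara(m, p, individual):
      premises_hold = m <= p and individual in m   # "All M are P" and "X is an M"
      return premises_hold                         # True means the conclusion follows

  men = {"Socrates", "Plato"}
  mortals = {"Socrates", "Plato", "Bucephalus"}

  print(barbara(men, mortals, "Socrates"))  # True: Socrates is mortal

Feed it premises in the right form and out comes the conclusion; feed it anything else and it simply refuses to conclude.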

The next interesting things happened in medieval logic, particularly in the Scholastic school. They developed logic further and built an interesting classification of the syllogisms, even producing a compact and meaningful naming system (that reminds one disturbingly of function prototypes). They went as far as applying logic to the greatest problem they knew, namely the Existence of God; we have five rather interesting proofs.

Logic, and the ordered world-view, went through the Renaissance crises and arrived at the end of the 19th century in very good shape. The then-prevalent world view, which we may call Lord Kelvin's universe, was ordered and reasonable. Some really hairy differential equations could explain the world of light, heat, electricity and physical phenomena. Mathematics was an endless mine of beauty, apparently self-consistent (an ideal soon to be codified in the Principia Mathematica). George Boole had formalized the logic operators (AND, OR, NOT) and devised truth tables. Gottlob Frege (a mathematician and a philosopher) had formalized logic even further, laying the foundations of predicate calculus.
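
As a quick illustration (a sketch in Python, which inherits its Boolean operators from exactly this tradition), here is how the three operators and their truth tables can be generated mechanically:

  # Print Boole-style truth tables for AND, OR and NOT.
  from itertools import product

  def truth_table(name, fn, arity=2):
      print(name)
      for values in product([False, True], repeat=arity):
          print(*values, "->", fn(*values))

  truth_table("AND", lambda a, b: a and b)
  truth_table("OR",  lambda a, b: a or b)
  truth_table("NOT", lambda a: not a, arity=1)
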
A really strange guy named Charles Babbage even had the peculiar idea of building a programmable mechanical device that would compute; but he never managed to build it. Read something related under "steampunk".

The turning of the century: the quantum theory spanner was thrown into the works of physics by Niels Bohr and the Copenhagen school, and the great Kurt Gödel proved some really nasty things about completeness (or the lack thereof) in mathematical systems.

That same revolution in physics led - eventually - to understanding the properties of semiconductors (and to many other things, including lasers and nuclear energy). Semiconductors are not essential to computing; the first computers actually employed mechanical relay switches and thermionic valves. But semiconductors can be made very small and very cheap. Currently the price of a transistor in an IC can be a tiny fraction of a USD cent.
On the theory side, it was the genius of Alan Turing that developed the link between mathematical functions (and in a sense, most problems can be seen as the act of computing a mathematical function from N to N) and computing machines. This link is the Turing machine, a theoretical programmable device.
Interestingly, Turing (a mathematician) developed his concepts before the physical machinery to test them was available.
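
To give a feel for the idea, here is a minimal sketch of a Turing machine simulator in Python (a drastically simplified rendering, not Turing's own formulation; the instruction table is made up for the example). The sample program computes the successor function n -> n + 1 on a unary tape: it scans right over the 1s and writes one more 1 in the first blank cell.

  def run_turing_machine(program, tape, state="scan", blank="_", max_steps=1000):
      tape, head = list(tape), 0
      for _ in range(max_steps):
          if state == "halt":
              break
          if head >= len(tape):
              tape.append(blank)                     # the tape is extended on demand
          write, move, state = program[(state, tape[head])]
          tape[head] = write
          head += 1 if move == "R" else -1
      return "".join(tape).strip(blank)

  # (current state, symbol read) -> (symbol to write, head move, next state)
  successor = {
      ("scan", "1"): ("1", "R", "scan"),   # keep scanning over the 1s
      ("scan", "_"): ("1", "R", "halt"),   # first blank: write a 1 and stop
  }

  print(run_turing_machine(successor, "111"))   # prints 1111, i.e. 3 + 1 in unary

The interesting point is that the machine is fixed; only the little table of instructions (the program) changes from problem to problem.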

The idea of a variable program running on a fixed piece of hardware was not new. The Jacquard loom did that. The Jaquet-Droz automata did that - before Babbage - as well. The addition was the conditional jump, which allows loops and thus universal computation.
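
A hedged sketch of what the conditional jump buys you, written in Python as a toy interpreter for an invented three-instruction machine (the instruction set is made up for illustration). Without the jump, execution can only run straight through the list of instructions, like a loom reading its cards; with it, a loop appears, and with the loop comes general computation:

  def run(program, registers):
      pc = 0                                 # program counter
      while pc < len(program):
          op, *args = program[pc]
          if op == "add":                    # add one register into another
              registers[args[0]] += registers[args[1]]
          elif op == "dec":                  # decrement a register
              registers[args[0]] -= 1
          elif op == "jnz":                  # the crucial bit: jump if not zero
              if registers[args[0]] != 0:
                  pc = args[1]
                  continue
          pc += 1
      return registers

  # Multiplication by repeated addition: a loop built out of a single backward jump.
  program = [
      ("add", "acc", "b"),   # 0: acc = acc + b
      ("dec", "a"),          # 1: a = a - 1
      ("jnz", "a", 0),       # 2: if a is not zero, jump back to instruction 0
  ]

  print(run(program, {"a": 4, "b": 3, "acc": 0}))   # acc ends up as 12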

On the practical side, we should note that the first applications of computing were entirely military in nature: cryptology, ballistics and nuclear-explosion simulations (implosion and initiator design) towards the end of WWII.
Where did innovation come from? It is to military and intelligence development that we owe the development, or outright invention, of some of the key technologies we still use today, including VR, the Internet and Operations Research.
A lot came out of some specific research centers like Xerox PARC (Ethernet, the laser printer, WIMP interfaces), Bell Labs (UNIX, the transistor), IBM (a lot of work on hard disks, memory and algorithms), MIT, Berkeley, CMU, Cornell, Monash University and universities in Europe.
Something (mostly applied tech rather than basic technologies) came from ad-hoc assemblages of people: many Internet protocols and services.

How Rocket Science became Computer Science first, and then COMDEX

The innards of a computer have not changed much in the last 30 years, and neither have the concepts behind them. There have been technological changes (the 8" floppy disk being replaced by the 5 1/4" floppy disk, being replaced by the 3 1/2" floppy disk, being replaced by... nothing, or maybe by writable CDs) and things have gotten better - better video cards, the addition of sound cards, bigger mass memory.
Bigger, better, faster... but not really different. We still have von Neumann machines running under the hood, moving data from core memory (AKA RAM) to mass memory. We still have peripherals. We still have interrupts.
Evolution in computer science does not happen quite as fast as marketing would like. Even computer systems hailed as NEW!!! are not: consider Mac OS X, and tell me what is new in it. If you say UNIX, though, I will have to kill you.

In a sense, what has happened in the last 20 to 25 years is a wave of homogenization. If you look at a copy of BYTE magazine from the early-to-mid 1980s, you will see the first wave of the Macintosh, a variety of operating systems (CP/M, MS-DOS, Windows, varieties of UNIX, the MacOS...), strange and bizarre hardware, a lot of it geared towards hobbyists; underneath the BYTE radar, the home computer world was teeming with things like the VIC-20, Sinclair machines, the Amiga, MSX and basically a lot of strange stuff. Mentions of object oriented programming. People using Ethernet. Modems costing 500 USD.
Now BYTE has died. But if you read those old issues you will notice that:

  1. We still have the same problems (SGML would solve many of them, but we still don't use it).
  2. PC makers still use the adjective "ultimate" and the noun "speed".
  3. Ads now are more professional-looking but probably less fun.
  4. Diversity has diminished.
  5. We have discovered the existence of the Internet (it was already there back then, but not many people had noticed).

Yet, it would be incorrect to claim that nothing happens in the computer world: if we ignore the illusion of progress that marketing would have us believe, we can see that, for example, the development model behind Linux is new (even if Linux itself is not new at all as a concept, being a reinvention of UNIX). We can see that there is something new in smaller systems, things that we would not even consider a computer: the Palm Pilot, for example - here the concept (although not new) has been brought to a usable level of maturity. And it has no mass storage. Lossy compression has been interestingly new and has resulted in useful technology (as opposed to fractal compression).
We have also learned that, in the computer jungle, the licensing model and the marketing policies of a particular combination of hardware and software can be more important than technical excellence.

In this lesson I also presented Design by Numbers, a small graphics-oriented language, excellent for teaching. The DBN assignment was "write 20 lines of code, including at most two procedures, that produce an interesting display (for some definition of interesting)". Many people in the class had never written a line of code. The assignment was completed satisfactorily, I would say.

You want to read this: Today Is the Tomorrow You were Promised Yesterday: 200 years of Information
and probably Chomsky Hierarchy as well.
There is a timeline at http://inventors.about.com/library/blcoindex.htm.
There is a great book whose title is Code: The Hidden Language of Computer Hardware and Software (by Charles Petzold). It is a chunky, solid, well written book. Recommended.

Lessons learned: some examples can be more confusing than what they try to explain. Check early and often whether the students are still alive and listening. Do not assume that everybody knows exactly what a real number, an integer or a circle is.

If this is all absurd and boring and blood is coming from your ears, remember that this is just what happens when professors try to be funny.

----> Zoooom forward to Evolution of the Computer.
