Most people think computer science is something to the effect of *the study of computers*. But it's not. It's really the study of computation: *what can be computed*. And everything that can be computed can be computed by a Turing Machine -- that's the Church-Turing thesis -- which is just a finite-state controller (think DFA) hooked up to an unbounded tape.
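To make the DFA-plus-tape picture concrete, here's a minimal sketch of a Turing Machine in Python. The names (`run_tm`, the bit-flipping example machine) are my own illustration, not anything from a real library: the transition table plays the role of the DFA, and a dict of cells plays the role of the tape.

```python
def run_tm(transitions, tape, state="start", head=0, max_steps=1000):
    """Run a Turing Machine until it reaches 'halt' or the step limit.

    transitions: (state, symbol) -> (new_state, symbol_to_write, 'L' or 'R')
    tape: dict mapping cell index -> symbol ('_' is the blank symbol)
    """
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return tape

# Example machine: scan right, flipping each bit, until a blank is seen.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

tape = {i: c for i, c in enumerate("1011")}
result = run_tm(flip, tape)
print("".join(result[i] for i in range(4)))  # flips 1011 -> 0100
```

The machine itself is finite (three transition rules), but the tape it reads and writes is unbounded -- that split between control and storage is the whole model.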

So zooming in a little, we have to figure out *how exactly to go about computing things* (the DFA), and *how to represent information* (the tape). Incidentally, this is why computer geeks go around making CS metaphors about every real-life thing (e.g., the conversation stack, binary search) -- because the majority of real life is dealing with information and solving problems.

How we actually go about implementing our Turing Machines (or close approximations thereof) -- and making them easy to use -- is left as an exercise for the reader. I see it as purely incidental to the science. The fact that we have these things called "transistors" and "keyboards" and "operating systems" that work in whatever way they do might turn out completely different if we rewound time and started the whole computer-invention process over again. But one thing wouldn't change: the essence and true nature of computer science, the computable.