No, I'm not going to fill the screen with digits. We all
know that's a bad idea, and therein lies the tale.
People's earliest experiences with representing numbers involved setting
up a one-to-one correspondence between the things they wished to count
and a matching tally of some analogous objects
-- fingers, toes, pebbles, notches in a stick, tally marks, what have you.
At some point the numbers involved became so large that this method
was impractical. With the invention of writing came symbols
to stand in for numbers, and ways to combine those symbols to represent big
numbers. This eventually developed into the familiar decimal
positional notation used for calculation by most of the world's literate
population today.
However, it's not too hard to see that beyond a certain size, decimal
positional notation begins to lose its expressive power. The sheer
tedium of writing down a 35,000-digit number, the amount of paper required
to hold it, and the possibility that the writer might lose his or her
place and have to start over all mean we have to find another
solution.
Another form of combining symbols lets us go a little further:
We can represent a really big number as some smaller number
raised to an exponent. It's far easier to write "2^2048"
than "32 317 006 071 311 007 300 714 876 688 669 951 960 444 102
669 715 484 032 130 345 427 524 655 138 867 890 893 197 201 411 522 913
463 688 717 960 921 898 019 494 119 559 150 490 921 095 088 152 386 448
283 120 630 877 367 300 996 091 750 197 750 389 652 106 796 057 638 384
067 568 276 792 218 642 619 756 161 838 094 338 476 170 470 581 645 852
036 305 042 887 575 891 541 065 808 607 552 399 123 930 385 521 914 333
389 668 342 420 684 974 786 564 569 494 856 176 035 326 322 058 077 805
659 331 026 192 708 460 314 150 258 592 864 177 116 725 943 603 718 461
857 357 598 351 152 301 645 904 403 697 613 233 287 231 227 125 684 710
820 209 725 157 101 726 931 323 469 678 542 580 656 697 935 045 997 268
352 998 638 215 525 166 389 437 335 543 602 135 433 229 604 645 318 478
604 952 148 193 555 853 611 059 596 230 656". This, combined with the
fact that relatively few significant digits carry enough meaning for
most purposes, results in the useful techniques of logarithms and scientific
notation.
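To make this concrete, here's a quick sketch in Python (whose built-in
bignums make it handy for this sort of thing) showing how a logarithm
recovers the size of 2^2048 without writing out a single digit, and how
scientific notation keeps just the leading few:

    import math

    # How many decimal digits does 2**2048 have?  A logarithm answers
    # without ever materializing the number:
    digits = math.floor(2048 * math.log10(2)) + 1
    print(digits)                    # 617

    # Python's arbitrary-precision integers let us check directly:
    print(len(str(2 ** 2048)))       # 617

    # Scientific notation: a handful of significant digits plus the size.
    mantissa = 10 ** (2048 * math.log10(2) % 1)
    print(f"2**2048 ~= {mantissa:.4f}e{digits - 1}")   # 3.2317e616

The 617 digits it counts are exactly the ones quoted in full above; the
logarithm buys us the size of the number at the cost of all but its
leading digits.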
But in our quest for ever larger numbers, exponents too become
insufficient. Of course, we can chain exponents, such as a^b^c^d.
Since exponentiation is not associative, we adopt a convention of right-associativity
for this, i.e., a^b^c^d = a^(b^(c^d)). This will
get us a long way: 10^10^10^10 is a really, really big
number.
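Python's ** operator happens to adopt the same right-associative
convention, which makes the difference in grouping easy to see:

    # Exponentiation is not associative, so grouping matters.
    # Python's ** is right-associative, matching a^(b^(c^d)):
    print(2 ** 3 ** 2)      # 512 == 2 ** (3 ** 2)
    print((2 ** 3) ** 2)    # 64 -- grouped the other way, far smaller

    # The tower 10 ** 10 ** 10 ** 10 is 10 ** (10 ** (10 ** 10)):
    # a 1 followed by 10**10**10 zeros.  We can name it in about a
    # dozen characters, but no computer could ever print its digits.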
So, what have we learned? In order to represent ever larger
numbers within a practical amount of time, we are forced to squeeze ever
larger amounts of information into smaller and smaller spaces, by abstracting
information away: We allow some combination of symbols to stand in for
a whole class of representations. The more layers of abstraction we can
pile on, the larger the numbers we can represent.
We eventually come to the point where abstracting the abstractions becomes
necessary. One way of doing this involves drawing geometric
figures around numbers, each figure standing for a larger number produced
by some algorithm from the number inside; with just a few such figures
we can write down some (really) big numbers (follow the link).
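If the figure-drawing scheme meant here is Steinhaus-Moser polygon
notation -- a guess on my part; follow the link to be sure -- its first
few figures can be sketched in a few lines of Python:

    def triangle(n):
        # n in a triangle stands for n ** n.
        return n ** n

    def square(n):
        # n in a square stands for n inside n nested triangles.
        result = n
        for _ in range(n):
            result = triangle(result)
        return result

    def pentagon(n):
        # n in a pentagon stands for n inside n nested squares.
        result = n
        for _ in range(n):
            result = square(result)
        return result

    print(triangle(2))   # 4
    print(square(2))     # triangle(triangle(2)) = triangle(4) = 256
    # pentagon(2) = square(square(2)) = square(256): that's 256 nested
    # triangles starting from 256.  Don't try to evaluate it.

Each new figure abstracts away an entire iterated application of the
previous one -- precisely the "abstracting the abstractions" move
described above.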