Contrary to what most people (like KuRL in his wu) have said, the Y2K "bug" was not the result of trying to save space. In fact, storing dates as a series of decimal digits wastes space compared to using binary integers. No, the Y2K problem was, for the most part, the result of a design flaw in COBOL, whose standard date format, for some braindead reason, used only two decimal digits for the year.

It was also the result of programmers' misconception that software is retired at the same rate hardware is (i.e. every 3 to 5 years). That may be the case for word processors, but it's definitely not the case for business-critical transaction processing systems and the like, running on big iron, where switching over to new software means you can't do any business if it doesn't work right away, and you will have very angry customers if there are undiscovered bugs. Also, that kind of software tends to be written to customer specification and is therefore extremely expensive. You don't spend that kind of money and run that kind of risk while the old software still works. And so it keeps running, on the 10th cycle of new hardware, with 3 layers of emulation underneath, for decades.
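To make both points concrete, here's a minimal sketch in Python (not COBOL, obviously): the `years_between` helper is hypothetical, standing in for the kind of naive two-digit-year arithmetic many COBOL programs effectively did, and the byte counts illustrate why decimal digits were never the space-saving option.

```python
import struct

def years_between(start_yy: int, end_yy: int) -> int:
    """Naive interval computation on two-digit years, as many
    old programs effectively did (hypothetical helper)."""
    return end_yy - start_yy

# A record written in '85, processed in '99: fine.
print(years_between(85, 99))   # 14

# The same record processed in 2000 ("00"): the interval goes
# negative -- the classic Y2K failure mode.
print(years_between(85, 0))    # -85

# The space argument: "YYMMDD" as six decimal characters is 6 bytes,
# while a binary count of days since some epoch fits in 2 bytes
# (65536 days is roughly 179 years).
print(len(b"991231"))                    # 6 bytes of decimal digits
print(len(struct.pack(">H", 36524)))     # 2 bytes of binary integer
```

So a binary representation would have sailed past 1999 for free; the two-digit decimal year was a format choice, not a space optimization.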