A Story

One thing I remember very distinctly from my childhood is shopping with my dad in the supermarket one day and watching him reach into the reflective freezer display and grab a pack of reasonably expensive hot dog wieners. The conversation went something like this:

"Hey dad, why did you get those?"
"Those are the good kind."
"I thought the cheapest thing was always the best thing." I didn't really believe this because my dad, like any good Southern dad, only ever bought generic soda. I didn't actually like it, but any time I tried to grab a pack of Pepsi, he would reflexively take it from me, put it up, and grab a pack of Winn-Dixie brand CHEK COLA. I hated it, but he told me that when I understood the value of money, I would prefer the generic soda as well. I was confused by the inconsistency.
"Huh? Not always."
"How can you tell?"
"Well, it's complicated. For most stuff, it doesn't really matter if you get the cheaper version. It tastes the same."
"Not really. I hate CHEK. It's definitely different."
"That's not really the point. It has to do with the way hot dogs are made."
"What are they made out of anyway?"
"Uh, well, that depends on what kind you get. If you get the good kind like these, they're made out of pork. If you get the bad kind, they're made out of whatever's left over from from the pig plus any old meat they have lying around and dead cockroaches and things like that."

In retrospect, I'm pretty sure he just liked the taste of the expensive ones better, but that mental image has stuck with me my whole life to the point that I still recoil slightly when anybody asks me if I want a hot dog (including my father). I imagined a factory with two parts: one side has happy people cutting up and processing the best damn sausage money can buy in an efficient way while the other side is manned by a bunch of overworked and overweight losers who empty out the trash cans from the other side into their own meat grinder and package the result as their version of "hot dogs." By the time both products get to the store, the only discernible factor separating them is price, and the average unsuspecting customer is more likely than not going to go for the cheaper variety and never know the horror that awaits them as they plop that mustard and relish on. What's a boy to do?

First Generation Video Game Consoles

I tell this story because it's relevant as an analogy to an unfortunate event known as the Video Game Crash of 1983. If you are within a certain age range -- say 21 at the youngest -- and you lived in North America, chances are pretty good that you had a Nintendo Entertainment System at some point during your childhood. Even the name of this ubiquitous piece of 1980s paraphernalia has its roots in the crash. A lot of people can't remember a time before Nintendo, and the concept that video games existed in any form besides Pong or various arcade games prior to the 8-bit savior is somewhat alien. Video games are also widely viewed as a stereotypically Japanese medium, which is probably a carry-over from the prevalent view in the 1980s that the yen would become the de facto world currency and we would all be speaking Nihongo in no time (my father, as a stockbroker, shared this notion; when I was about 6, he recommended that I consider learning Japanese).

Of course, the NES wasn't the first video game console. Hell, it wasn't even the second: Nintendo didn't enter the North American home market until after the crash, and the NES is considered to be part of the Third Generation of video game consoles. Almost all first-generation consoles were born in the USA and were, by today's standards, extremely unsophisticated. They were, by and large, what are called dedicated consoles: that is, they came preprogrammed with a handful of games or, more frequently, with only one game. The first of this batch was the Magnavox Odyssey, which was released in 1972. The Odyssey was quite weak: it produced no sound, it was incapable of keeping score (it came with pencils and sheets of paper for this purpose), and it was monochromatic. The black-and-white issue was partially resolved by the inclusion of cellophane panels that the user could tape over his or her television screen to mimic "realistic" backgrounds for different games. Unfortunately, these sheets only came in two sizes, so the effect was somewhat diminished. The Odyssey came preprogrammed with games like the imaginatively named Baseball, Cat & Mouse, and, perhaps most significantly, Tennis, all of which had to be activated by inserting cartridges into the system (the cartridges held no game data of their own; they simply acted as jumpers that selected among the circuits already built into the console). The Odyssey also had an optional light gun peripheral, although it wasn't terribly accurate and only had one relevant game.

In 1972, arcade game developer Nolan Bushnell attended a Magnavox demonstration and got a chance to see the Odyssey in action. Specifically, he was able to get a look at Tennis. Captivated by the simple yet addictive game, he set out to design and release an arcade version of it under his own company's name: Atari. The resultant game had its name changed to Pong in an (ultimately unsuccessful) attempt to avoid legal issues and licensing fees. Over the next couple of years, several companies released dedicated Pong home consoles, much to the consternation of Magnavox. Sales of the Odyssey were slow, mainly due to a widespread but inaccurate perception that the console would only work on Magnavox televisions, and other companies like Atari and Coleco were able to play on this fear to gain better market share.

Second Generation

Unfortunately, a lot of companies got the same idea throughout the mid-1970s. Anybody who could put together a plastic and metal casing and rig up two paddle controllers released a dedicated Pong system (including, for an extremely brief period of time, Nintendo). Pong, though vaguely fun for a little while, just has certain limits to its staying power. Sure, newer consoles supported things like score keeping and sound, but at the end of the day, you're just bouncing a nonexistent ball back and forth until the first person reaches 7 points. These systems were rendered obsolete by the arrival of more powerful, cartridge-based machines: the Fairchild Channel F in 1976 and the Atari 2600 in 1977. The fact that basically all home video gaming in the United States was based on an outdated concept led to what could be considered a minor video game crash in 1977. Unable to compete with these newer, faster, and more diverse systems, most Pong-clone companies went down like the prom queen: they slashed their prices and gracelessly exited a market in which they were hopelessly outclassed. This small crash (which was really nothing more than an inferior technology being naturally supplanted by a superior one) would presage the bigger crash of 1983.

The defining feature of the second generation of video game consoles was the storage of games on removable cartridges. The Fairchild Channel F, though ahead of its time, was still firmly of the mindset that Pong was the way to go and as such included the game as part of the system's internal memory. This had the effect of coloring perceptions of it as yet another Pong clone despite its somewhat diverse game library. In 1978, the Channel F was discontinued, and Fairchild sold its video game development division to another company, which attempted to revitalize the system the next year but quickly gave up. Magnavox released its next-generation console, the Odyssey 2, in 1978 to great fanfare. It was packaged with a full keyboard in addition to two joystick controllers and was marketed as both a conduit for entertainment and an educational aid. The Odyssey 2 sold well in every market except for Japan, where video arcade gaming was still more popular.

On a similar note, it was around this same time that the land of the rising sun gave us one of the most important innovations in gaming history: the concept of the protagonist in video games. Most games up until that point had had generic sprites devoid of character or personality, which made sense considering that, for the most part, they were all either sports-oriented like Pong or space shooters like Space Invaders. The Japanese entertainment company Namco created a sensation with the North American localization of its quirky game Puck Man. The name was changed to Pac-Man out of a desire to prevent the obvious vandalism that would have erupted on most of the game cabinets across the nation. Pac-Man was unique for three reasons at the time: first, the player was given a character with something of a personality and a sense of emotional investment, however rudimentary, in his well-being; second, the game injected a good amount of intentional humor into itself to break up the monotony of what was essentially endless iterations of the same basic concept (the stages in Pac-Man are all identical except for the point values and the power levels of the enemies); and finally, the game appealed to more than the stereotypical gaming crowd (i.e., the girls liked it too). The race was on to figure out who would gain the exclusive home console licensing rights to this cultural phenomenon.

Signs of Decay

Despite how good things were looking overall, obvious fissures in the industry were beginning to develop. Aside from the aforementioned demise of several so-called "developers" (the Pong cloners), 1979 was a landmark year for how video games were conceived of in creative terms. Several Atari programmers were extremely disenchanted with their company's lack of recognition for their work (no royalties, no individual credit for their contributions to various games, etc.) and left to form the first third-party video game development company, which they named Activision. While this had the effect of allowing game programmers to get the credit they deserved for their work, it also sort of opened the floodgates for all sorts of third-party developers with far less prestigious pedigrees to try their hands at making games for the big systems. But more about them later.

Atari and Magnavox got some more competition in the early 1980s from several sources. In 1980, Mattel, the toy company most famous for creating the popular Barbie line, launched its own home video game console, the Intellivision. The Intellivision was graphically superior to the 2600 and had more games than the Odyssey 2 (which suffered from a lack of decent third-party titles). This three-way competition was not entirely dissimilar to the situation we see today with the Nintendo Wii, the Microsoft Xbox 360, and the Sony Playstation 3. This was a healthy time for home video gaming, but only briefly. The room got increasingly crowded when companies like Milton Bradley and the radio manufacturer Emerson released their own consoles to little fanfare. Not only that, Coleco decided to get back in on the action with the Colecovision, the successor to their first-generation dedicated console, the Telstar.

Atari was suffering as a result of all this competition and, for the first time, started losing ground. The 2600, while still popular, was no longer on the technological cutting edge. In 1981, however, the company scored a great coup when it was awarded the much-coveted North American console distribution rights to the Holy Grail of video gaming: Pac-Man. The Colecovision did well the next year after having gained the rights to the popular arcade game Donkey Kong, making it the first home console appearance of a certain Italian plumber. Magnavox was hurting somewhat over the loss of the rights to Pac-Man, but had not forgotten that Atari had made their bones by stealing their tennis game. Magnavox created a Pac-Man clone called K.C. Munchkin, which was identical in virtually all respects to the real thing. To add insult to injury, K.C. Munchkin was released months before Atari's console port of Pac-Man was ready, prompting a lawsuit that had all unsold copies of the game pulled from shelves.

It was around this same time that console manufacturers started having real problems with third-party quality control and unauthorized games appearing for use on their consoles. Sure, this was the same era that brought us classics like Pitfall and Frogger, but it also saw the advent of companies that had no real business getting involved in video gaming doing just that. No-name, Z-grade organizations popped up trying to cash in quickly on the video game fad. Quaker Oats, of all companies, incorporated U.S. Games as its video gaming division and created such masterpieces as EggoMania (not related to the delicious toaster waffles) and Name This Game, which was the subject of a never-finished contest to (duh) name it. The Atari 2600 was the recipient of most of these types of games because (a) it was the easiest one to program for and (b) it had more market saturation than other consoles (despite naturally lagging sales at the end of its life cycle).

The problem of unauthorized software was not exclusive to Atari, but it did disproportionately affect public perceptions of the company. This is perhaps best typified by a somewhat obscure developer called Mystique. Mystique tried to corner what was a then-untapped market: the erotic video game. Now, I'll let that sink in for a minute. Even today, it's almost impossible for games with sophisticated CGI graphics to be considered "erotic" by most standards, so imagine what it was like back in the days when eight whole bits were a big deal. Mystique published such gems as Custer's Revenge, where the player portrays a nude George Armstrong Custer with a massive pixelated erection. The goal of this game is to repeatedly rape a Native American woman tied to a pole without being hit by falling arrows. Another game, Knight on the Town, features a knight trying to save a princess for the purpose of having sex with her. In order to accomplish this, the player must brave randomly falling pieces of the floor and giant crocodiles trying to bite the knight's penis off. Yeah. Needless to say, these sorts of titles put an unsavory face on what was already considered to be a sufficiently mind-numbing medium.

To be fair, Atari's first-party games weren't really anything to write home about at this point either. The company had released the Atari 5200 in 1982, marking their first serious attempt in five years at giving their home console line a much-needed update. The problem with the 5200 was that it wasn't so much a video game console as it was an Atari 800 personal computer minus a keyboard and plus a few badly made peripherals and only a handful of games. Conspicuously absent from the 5200 was any sort of backward compatibility with the 2600, which alienated a large segment of Atari owners. Why buy a more expensive piece of hardware that lacks any decent new games and won't play the old ones? Atari's software development focus self-defeatingly returned to the 2600.

1982 was supposed to be a watershed year for Atari: not only was their port of Pac-Man due for release, but so was their exclusive tie-in based on Steven Spielberg's incredibly successful film E.T. the Extra-Terrestrial. Atari announced in October that the game would be ready by Christmas that year, which meant the entire game had to be written, programmed, produced, tested, finished, and shipped in about a six-week timeframe. Pac-Man was slated for release first, but when it finally hit stores and people took it home, it was barely recognizable as such. Even by the standards of 1982, Pac-Man was not a tremendously sophisticated game, but the Atari port suffered from innumerable problems. The screen was rotated 90 degrees from the familiar arcade setup and the colors were completely off. Not only that, the sprites were jagged, and the 2600's hardware could only draw so many objects at once, so the four ghosts had to share screen time and flickered in and out of visibility, meaning that the player would often be unable to tell where they were or where they were going until it was too late. The sound was also hilariously awful, being far off from the arcade version (which, again, was not particularly advanced). Pac-Man sold very well despite these issues, but only on the strength of brand-name recognition, and many people were quite rightly offended by the low quality of the port. About 70% of all ten million 2600 owners bought the game, which by any standard is highly impressive...until you realize that Atari produced about twelve million cartridges, leaving an astonishing five million unsold copies of the game.
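
For the curious, those cartridge numbers do hang together. Here's a minimal back-of-the-envelope sketch in Python, using only the rough figures quoted above (the ten-million install base, the 70% attach rate, and the twelve-million production run are this essay's estimates, not official Atari sales data):

    # Sanity check of the Pac-Man cartridge figures quoted above.
    # All inputs are the essay's rough estimates, not official Atari data.
    install_base = 10_000_000         # approximate number of Atari 2600s in homes
    attach_rate = 0.70                # roughly 70% of owners bought the port
    cartridges_produced = 12_000_000  # approximate size of the production run

    copies_sold = int(install_base * attach_rate)  # about 7 million
    unsold = cartridges_produced - copies_sold     # about 5 million

    print(f"Copies sold:  {copies_sold:,}")
    print(f"Unsold stock: {unsold:,}")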

Atari's E.T. experience is in a category of its own. People weren't exactly thrilled with Pac-Man, but they bought it anyway. Atari grossly overestimated how much enthusiasm there was for video gaming in general at this point, never mind how much enthusiasm there was for an E.T. game. It was a semi-nonsensical adventure game with redundant and frustrating gameplay. It sold only about a third of the units produced, many of which were returned to retailers and then in turn returned to Atari. Financially, this was a double whammy, since Atari had to reimburse retailers for unsold/returned product and eat the hefty licensing fee they had paid for the game. Rather than try to salvage the situation, Atari trucked millions of unsold pieces of 2600 and 5200 hardware and software to a landfill in New Mexico in 1983 and simply buried them. Atari never recovered: the following year, its parent company, Warner Communications, broke the company up and sold off the pieces.

The End

Atari's effective demise in 1983 was paradoxically bad for their competitors. Atari's software development division had also been making games for rival consoles, and the loss of that output represented a serious blow to the Colecovision and Intellivision. Retailers were also at a loss as to what to do with all these games that weren't moving out of their stores; with at least seven consoles to support, stores were flooded with product. Prices had to be slashed, and the cheaper titles were, for obvious reasons, the best sellers. Unfortunately, the cheaper (mainly third-party) titles were almost universally awful, and people stopped caring about home video gaming. One after another, the third-party developers stopped making games, and the console makers returned to their other business interests, when they had them.

The final problem home video gaming faced came from home computing. Commodore was smart enough to see that things were changing in the 1980s; out was the 1970s mindset of fun and entertainment, and in was the focus on preparation for success and material gain. Commodore successfully capitalized on the perception that video games would rot the brain of the average American child but that their home computer (which was oh-so-reasonably-priced) would give little Johnny an important edge in getting into college and becoming a real stockbroker who could go toe-to-toe with those rascally Japanese firms that were taking over the country. Coincidentally, these affordable home computers also played video games that were considerably more advanced and more interesting than most of what was available for the consoles. When the Commodore 64 lowered its price to $200 (a 66% price drop), it was more affordable than any other home computer of the era and competitive with the video game consoles themselves, and it killed off the competition. By the end of 1983, home video gaming was considered a fad best forgotten, and people went back to playing games in arcades.

The Beginning?

Obviously, home video gaming didn't completely die out. Less than two years later, the Japanese company Nintendo revitalized the industry in North America with its Nintendo Entertainment System, and business has been going strong ever since. Nintendo's success can only be understood in the context of the video game crash of 1983 and the lessons learned from it. Their console's name is indicative of a desire to placate retailers who were unsure about taking another gamble on video gaming; by packaging the NES as a piece of generic "entertainment" hardware and bundling early units with the useless R.O.B. plastic robot toy peripheral, Nintendo was hoping to avoid guilt by association with the now-fallen Atari and Odyssey. As Nintendo saw it, the two biggest factors that led to the crash were (a) the lack of quality control in the development of games and (b) the overabundance of video game consoles. Nintendo solved these problems in a very simple manner: anybody wanting to develop a game for Nintendo had to agree to an exclusivity contract prohibiting them from selling the same game to another company for a certain period of time or face punitive action. Nintendo also installed a lockout chip in the console that prevented unauthorized developers from making games for it. In practice, the 10NES lockout chip wasn't that sophisticated, but it scared off a fair number of even less sophisticated programmers, and those who dared to crack it faced swift legal retaliation from Nintendo, often driving these fledgling companies out of business immediately (even though courts would later cast doubt on whether Nintendo had the legal basis to sue everyone who did this). Nintendo could then stamp every game on its system with its own seal of quality, which inspired consumer trust and brand loyalty, two things noticeably missing from the second generation of video game consoles.

These practices allowed Nintendo to dominate the third and fourth generations of video game consoles. It was Nintendo's resistance to changing technological preferences that let the Sony Playstation overtake the Nintendo 64 and that, in turn, put the Nintendo Gamecube in a distant third place behind the Playstation 2 and Microsoft's original Xbox. Nintendo's newest home console, the Wii, was released in 2006 to great fanfare and has currently sold more units than either the Xbox 360 or the Playstation 3. The Wii appeals to the casual gamer with its emphasis on affordable hardware, uncomplicated gameplay for new titles, access to classic titles such as the original Super Mario Bros. via its Virtual Console service, and its unique motion detection controllers. Ironically, the Wii is in a similar place to the Atari 2600 in 1982: it lacks the processing power and the graphical capabilities of its direct competitors, and its library is becoming bloated with inferior third-party titles like Chicken Shoot and Ninja Bread Man, along with poor ports of popular Playstation 2 games like Prince of Persia: The Two Thrones and Call of Duty 3. First-party titles for the Wii (such as Super Smash Bros. Brawl or The Legend of Zelda: Twilight Princess) are usually great games, but the Official Nintendo Seal is beginning to mean less and less. Whether Nintendo is becoming complacent in its success has yet to be determined, as does whether the company actually learned the lessons of 1983 or just passively reacted to them.
