The Advanced Micro Devices Duron, a Socket 462/A CPU, was released in April 2000 and first shipped(*) on June 5th, 2000. The AMD Duron was the budget version of the AMD Athlon, slower and cheaper. Just as Intel's Pentium III had the Celeron, the Athlon had the Duron. While the Duron received what can only be described as moderate fanfare (it shipped on the same day AMD released the Athlon Thunderbird, a processor that would become famous for its low price and high speed), it still had quite a large market to cater to, and hardly went ignored. It has since been replaced by the Sempron.



Spitfire
This first Duron used the Spitfire core(**), which was based on the Athlon's Thunderbird core. It ran a 100MHz frontside bus, but that bus used the double-pumped Alpha EV6 protocol, transferring data twice per clock cycle, so it effectively had a 200MHz bus speed. It had 128KB of L1 cache (same as the Thunderbird in every way; both are 2-way set associative(`)) and 64KB of L2 cache (the Thunderbird had 256KB, but both are still 16-way set associative(`)). The Thunderbird would also later scale to a 133MHz (effectively 266MHz) FSB while the Duron was left at 100MHz. The lower bus speed saved AMD nothing, but less cache meant the CPU die was smaller, which meant they could make more CPUs per wafer, which meant lower production costs, which meant a cheaper CPU for the end user.
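
If the set-associativity business(`) feels too abstract, here's the idea in a few lines of C. This is a sketch for illustration only; the 64-byte line size is my assumption for the example, not a claim about AMD's exact implementation.

    #include <stdio.h>

    #define LINE_SIZE 64  /* bytes per cache line (assumed for illustration) */

    /* A cache of `cache_bytes` total, with `ways` lines per set. */
    unsigned num_sets(unsigned cache_bytes, unsigned ways) {
        return cache_bytes / (LINE_SIZE * ways);
    }

    /* Each address maps to exactly one set, but may occupy any of that
       set's `ways` slots -- more ways means more placement flexibility. */
    unsigned set_index(unsigned long addr, unsigned cache_bytes, unsigned ways) {
        return (unsigned)((addr / LINE_SIZE) % num_sets(cache_bytes, ways));
    }

    int main(void) {
        unsigned long addr = 0x1234abc0UL;
        /* Duron Spitfire L2: 64KB, 16-way */
        printf("Duron L2:   set %u of %u\n",
               set_index(addr, 64 * 1024, 16), num_sets(64 * 1024, 16));
        /* Celeron Coppermine-128 L2: 128KB, 4-way */
        printf("Celeron L2: set %u of %u\n",
               set_index(addr, 128 * 1024, 4), num_sets(128 * 1024, 4));
        return 0;
    }

The point: more ways per set gives a given address more candidate slots to land in, so the cache evicts useful data less often.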

Initially released at 600MHz, 650MHz, and 700MHz(***), the Duron's job was to compete with the Coppermine 128 Celeron. The Celeron was still operating with a 66MHz FSB at this time, while the Duron effectively had a 200MHz FSB. The Celeron did have more L2 cache (128KB to the Duron's 64KB), but it had only 32KB of L1 cache (as did the Pentium III). Also, the Celeron's L2 cache was alarmingly inefficient; Intel was basically taking Pentium IIIs with partially nonfunctional cache and selling them as Celerons, and disabling half of the Pentium III's 8-way, 256KB L2 left the Celeron with a 4-way, 128KB cache. The Duron's L2 cache was 16-way associative, if you're wondering.(`) Worse, the Celeron's cache was inclusive, meaning that the contents of the 32KB L1 had to be mirrored in the L2, essentially leaving it with 96KB of effective L2 cache instead of 128KB. The Duron's was exclusive, so its L1 and L2 held entirely different data. Taking all of this into account, it's not hard to see why the Duron beat the pants off of the Celeron.
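
To put numbers on the inclusive-versus-exclusive point, here's the back-of-the-envelope arithmetic as C. The figures come straight from the paragraph above; the function names are mine.

    #include <stdio.h>

    /* An inclusive L2 must mirror everything in L1, so the duplicated
       data buys no extra capacity: unique data = L2 size. */
    unsigned inclusive_unique_kb(unsigned l1_kb, unsigned l2_kb) {
        (void)l1_kb;  /* L1's contents are already counted inside L2 */
        return l2_kb;
    }

    /* An exclusive L1/L2 pair holds disjoint data, so sizes simply add. */
    unsigned exclusive_unique_kb(unsigned l1_kb, unsigned l2_kb) {
        return l1_kb + l2_kb;
    }

    int main(void) {
        /* Celeron Coppermine-128: 32KB L1, 128KB inclusive L2 */
        printf("Celeron: %uKB unique (only %uKB of L2 isn't an L1 copy)\n",
               inclusive_unique_kb(32, 128), 128 - 32);
        /* Duron Spitfire: 128KB L1, 64KB exclusive L2 */
        printf("Duron:   %uKB unique\n", exclusive_unique_kb(128, 64));
        return 0;
    }

That's 192KB of unique data for the Duron against 128KB for the Celeron, despite the Celeron's "bigger" L2.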

Every time the Celeron was pitted against a Duron of equal clock speed, the Duron performed anywhere from 10% to 30% better in both artificial (SYSMark 2000) and real-world (Quake III) benchmarks. Additionally, the Duron was cheaper. All the Celeron had left was its almost-legendary overclocking ability-- and the Duron was shaping up to be quite the overclocking chip as well. What's more, if you had a pencil and five minutes, you could unlock the multiplier, meaning that you no longer had to rely on risky FSB adjustment to overclock. (See flamingweasel's writeup if you want to unlock.) Thanks to the Duron Spitfire, the only market for the Celeron had become stupid people.(``) The Spitfire core would later scale all the way up to 950MHz, and even when the Celeron was finally given a 100MHz FSB with the Celeron 800, it could not compete.
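
For reference, the core clock is just the FSB times the multiplier, which is why unlocking mattered so much. A quick C sketch with illustrative numbers; the two overclocks here are made up for the example, not recommended settings.

    #include <stdio.h>

    /* Core clock = front-side bus clock * multiplier. Raising the FSB
       drags memory and PCI/AGP clocks out of spec along with the CPU;
       raising an unlocked multiplier speeds up only the core. */
    double core_mhz(double fsb_mhz, double multiplier) {
        return fsb_mhz * multiplier;
    }

    int main(void) {
        printf("Stock Duron 700:     %.0fMHz\n", core_mhz(100.0, 7.0));
        printf("Risky FSB overclock: %.0fMHz\n", core_mhz(112.0, 7.0));
        printf("Unlocked multiplier: %.0fMHz\n", core_mhz(100.0, 9.0));
        return 0;
    }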


Morgan
The Duron 1GHz, released in August 2001, wasn't the same as the Durons before it. It was based on the new Morgan core, which was the budget version of the Palomino core. If you don't recognize the name, the Palomino was the first-generation Athlon XP, meaning that the Morgan had all the architectural improvements of the Athlon XP (SSE support, hardware data prefetch, and more). This meant that instead of the small ~5% performance boost you'd expect from a 50MHz clock bump, the new Duron at 1GHz was about 10-15% faster than the Spitfire at 950MHz.
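
If "SSE support" is just a buzzword to you, here's roughly what it buys: one instruction operating on four floats at once. This is a minimal sketch using the standard SSE intrinsics, not anything Duron-specific.

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics; build with -msse on gcc */

    int main(void) {
        float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
        float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
        float r[4];

        /* One addps instruction adds all four pairs at once, where
           plain x87 code would need four separate adds. */
        __m128 va = _mm_loadu_ps(a);
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(r, _mm_add_ps(va, vb));

        printf("%g %g %g %g\n", r[0], r[1], r[2], r[3]);
        return 0;
    }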

But the market had changed. More specifically, the Athlon was now available at obscenely low prices. So even though the Duron Morgan was a significant improvement over the Spitfire, its market was considerably smaller. In the days of the Spitfire, the large cost difference between the Athlon and Duron had made the Duron a worthwhile choice; now, you could simply pay a few dollars more (about $20 US) and get a significantly faster Thunderbird. The end of the Duron was near. (And never mind the Celeron; it was priced at about the same level as some slower Thunderbirds. I hold that it should have died long ago.) AMD scaled the processor up to 1.3GHz in 100MHz increments, then simply stopped bothering, as the low-end Athlons offered superior speed for a very small price premium. And so the Duron quietly faded away...


Applebred
... until in August 2003, AMD suddenly announced that they would release the Duron at 1.4GHz, 1.6GHz, and 1.8GHz, much to the confusion of the hardware community. The new "Applebred" Durons are based on the Thoroughbred-B core, with 192KB of its 256KB of L2 cache disabled. The Applebreds also have a 266MHz frontside bus, same as the lower-end Athlon XPs. Lastly, the shift from a .18-micron to a .13-micron process makes Applebreds cheaper to produce and reduces heat output (smaller transistors can run at lower voltages).
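
The voltage point is worth a number: dynamic power scales roughly as voltage squared times frequency (P = C * V^2 * f), so even a small voltage drop buys a lot. A rough C sketch follows; the voltages are ballpark figures for these processes, not AMD's official specs.

    #include <stdio.h>

    /* Dynamic power scales roughly as P = C * V^2 * f. Holding the
       (unknown) capacitance term constant lets us compare ratios. */
    double rel_power(double volts, double mhz) {
        return volts * volts * mhz;
    }

    int main(void) {
        double spitfire  = rel_power(1.6, 950.0);   /* .18-micron, ~1.6V (ballpark) */
        double applebred = rel_power(1.5, 1400.0);  /* .13-micron, ~1.5V (ballpark) */
        printf("%.2fx the power for %.2fx the clock speed\n",
               applebred / spitfire, 1400.0 / 950.0);
        return 0;
    }

In other words, roughly 1.3x the power for nearly 1.5x the clock speed.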

What this doesn't explain, of course, is why AMD bothered. After many, many people asked what was going on, AMD explained that there was still a market in developing countries such as Russia and Mexico for a low-cost chip with reasonable performance. The Duron is famous in places like these for its low price and good performance, and AMD want to make sure that they're covering every single market available.

As noted earlier, Applebreds are Thoroughbred-Bs with 192KB of their L2 cache disabled. Some bored Russians (it's always the Russians) figured out how to enable all of the L2 cache, but warned that it didn't always work. AMD is probably selling Thoroughbred-Bs with partially nonfunctional L2 cache as Applebreds. This isn't always the case, of course, else these enterprising Russians wouldn't have been able to enable the rest of the L2 cache on theirs; I suspect that some Applebreds are semiborked Thoroughbred-Bs, and others might just be relabelled Thoroughbred-Bs that haven't had all of their L2 cache tested and have only 64KB of it enabled so they can be sold as Durons. More information will be available as soon as I can find out just how often people can enable the cache on their Applebreds. Wow, I wish I spoke Russian.

For those who are curious about the Duron's performance, the answer is simple: it's the same as always, roughly 80% (occasionally 95%, occasionally 60%) as fast as an Athlon (well, Athlon XP) at the same clock speed. These chips are now available in countries such as the United States and the United Kingdom, so if you're curious about the new Duron for whatever reason and live in the US or Western Europe, it should be easy to get one for the time being (although they'll be getting trickier and trickier to find as time goes on).

The Applebred Duron was also the last Duron ever produced. AMD's next attempt at a budget line was called the Sempron.





* Hardware usually ships a few weeks after release, and with "paper launches", the gap between release and ship date has been steadily widening. There are some exceptions, such as the Intel 865 chipset, which was actually available before its official release through normal retail channels. A refreshing change of pace, to say the least.

** Cores (the "hearts" of CPUs) are always given names while they're in development, and hardware people like to call their processors by their cores (such as "Katmai 500") to avoid confusion; there are usually multiple cores for each processor, and there are notable differences between each type of core. Intel usually name theirs after random rivers and places out West, and lately AMD have been naming them after horses. I still say that "Spitfire" and "Thunderbird" sound way better than "Palomino" and "Morgan," but I guess they don't feel the same way. @&#!ing horses.

*** New CPUs tend to be released at three different clock speeds exactly. One means they're having serious yield problems, two means they're cocky, four or more means they're worried.

` If you would like to know more about what exactly this means, AnandTech explains it briefly here: http://www.anandtech.com/cpu/showdoc.html?i=1252&p=5

`` Okay, stupid is harsh; after all, not everyone cares about their CPU. But I feel that if you're going to blow a thousand bucks on a computer, you might as well make sure it's not a piece of crap. Would you buy a car without checking anything or asking anyone first? I'd certainly hope not.