The megahertz myth developed when computer advertising emphasised the clock speed of the CPU whilst ignoring other elements of the CPU's operation, such as power consumption and per-cycle performance. This promoted the belief that the faster the clock speed, the better the performance of the computer. In truth, two computers with different CPUs of identical clock speed can have quite different levels of performance, because overall throughput depends not only on how many cycles the CPU executes per second but also on how much work it completes in each cycle.
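A rough way to see this is to treat a processor's throughput as the product of its clock rate and its average instructions per cycle (IPC). The sketch below uses hypothetical clock rates and IPC figures, chosen purely for illustration, to show how a lower-clocked chip can still come out ahead:

```python
# Rough model: throughput = clock rate (cycles/second) x instructions per cycle.
# The two CPUs and their figures are hypothetical, for illustration only.

def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Millions of instructions per second under the simple clock x IPC model."""
    return clock_mhz * ipc

cpu_a = throughput_mips(clock_mhz=1400, ipc=0.8)  # high clock, low per-cycle work
cpu_b = throughput_mips(clock_mhz=800, ipc=1.6)   # low clock, high per-cycle work

print(f"CPU A: 1400 MHz x 0.8 IPC = {cpu_a:.0f} MIPS")
print(f"CPU B:  800 MHz x 1.6 IPC = {cpu_b:.0f} MIPS")
# CPU B completes more instructions per second despite the lower clock speed.
```

Real performance also depends on memory latency, instruction mix and other factors, but even this simple model shows why clock speed alone is a poor yardstick.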

In 1994 Apple Computer introduced the PowerPC chip, which gave its machines a marked performance lead over Windows-based machines. By late 1997, however, this lead had been lost when Intel introduced a new chip for the Windows platform with improved performance and a higher clock speed. Consequently, Apple was left to promote the limited advantage it retained in the field of laptops.

By 2000 the belief that clock speed was all-important had begun to influence the design of new chips, so much so that Intel, with its Pentium 4, deliberately sacrificed per-cycle performance in exchange for higher clock speeds.
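The nature of this trade-off can be sketched with the same clock × IPC model. Deepening a pipeline lets each stage do less work per cycle, which raises the achievable clock rate, but a deeper pipeline also loses more cycles whenever a stall (such as a branch misprediction) forces it to be flushed. The pipeline depths, clock rates and stall figures below are invented for illustration, not measurements of any real processor:

```python
# Illustrative only: these pipeline depths, clocks and IPC figures are
# hypothetical, not measurements of any real Intel or Apple processor.

def effective_mips(clock_mhz: float, base_ipc: float,
                   stall_rate: float, pipeline_depth: int) -> float:
    """Clock x IPC model where each stall flushes the whole pipeline.

    stall_rate is the fraction of instructions that trigger a flush;
    deeper pipelines pay a larger penalty per flush.
    """
    # Average cycles per instruction: the base cost plus the expected flush cost.
    cpi = 1.0 / base_ipc + stall_rate * pipeline_depth
    return clock_mhz / cpi

shallow = effective_mips(clock_mhz=1000, base_ipc=1.5, stall_rate=0.05, pipeline_depth=10)
deep    = effective_mips(clock_mhz=2000, base_ipc=1.5, stall_rate=0.05, pipeline_depth=30)

print(f"Shallow pipeline: {shallow:.0f} MIPS at 1000 MHz")
print(f"Deep pipeline:    {deep:.0f} MIPS at 2000 MHz")
# Doubling the clock does not come close to doubling throughput once the
# deeper pipeline's larger stall penalty is taken into account.
```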

In July 2001, Apple introduced a new chip whose clock speed was about half that of its Intel-manufactured equivalent, but which nevertheless significantly outperformed it. Apple then embarked on a major advertising campaign to explain that CPU performance involves more than clock speed, and it was in the course of this campaign that the term 'megahertz myth' was coined.

It turned out that both the chip used by Apple and the Intel chip then used by the Windows platform were unsatisfactory for the further development of laptop machines, largely because of their power consumption and heat output. It was Intel that developed a chip with a significantly new architecture capable of resolving these problems, and the megahertz war came to an end when, in 2005, Apple announced that it would switch to Intel-developed chips.
