What Is Moore's Law?
Moore's Law asserts that the number of transistors on a microchip doubles about every two years, while the cost of computers is halved. In other words, we can expect the speed and capability of our computers to increase every couple of years, and to pay less for them. Another tenet of Moore's Law is that this growth in the microprocessor industry is exponential, meaning it compounds steadily and rapidly over time.
Understanding Moore's Law
In 1965, Gordon E. Moore—the co-founder of Intel (NASDAQ: INTC)—postulated in a magazine article that the number of transistors that can be packed into a given unit of space will double about every two years. (Now, however, doubling of installed transistors on silicon chips occurs closer to every 18 months instead of every two years.) Gordon Moore did not call his observation "Moore's Law," nor did he set out to create a "law." Moore made that statement based on noticing emerging trends in chip manufacturing at Intel. Moore's insight became a prediction, which in turn became the golden rule known as Moore's Law.
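The doubling described above is a simple exponential. A minimal sketch of the arithmetic, assuming the two-year doubling period from the article and using Intel's first microprocessor, the 4004 (about 2,300 transistors in 1971), as an illustrative starting point:

```python
# Illustrative projection of Moore's Law, not a claim about actual chips.
# Assumptions: a steady two-year doubling period, starting from the
# Intel 4004's roughly 2,300 transistors (1971).

def transistors(years_elapsed, start=2300, doubling_period=2.0):
    """Transistor count after years_elapsed years of steady doubling."""
    return start * 2 ** (years_elapsed / doubling_period)

# Ten doublings over 20 years multiply the count by 2**10 = 1,024.
print(round(transistors(20)))  # -> 2355200
```

Real doubling periods have varied (the article notes figures closer to 18 months), so the exponent's denominator is the knob that matters most in this kind of back-of-the-envelope projection.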
From Prediction to Truism
Moore's Law proved true. In the decades since Gordon Moore's original observation, it has guided the semiconductor industry in long-term planning and in setting targets for research and development (R&D). It has been a driving force of the technological and social change, productivity, and economic growth that are hallmarks of the late twentieth and early twenty-first centuries.
Moore's Law—Nearly 60 Years, Still Strong
Nearly 60 years later, we still feel the lasting impact and benefits of Moore's Law in many ways.
Moore's Law implies that computers, machines that run on computers, and computing power all become smaller and faster with time, as transistors on integrated circuits become more efficient. Transistors are microscopic switches, fabricated primarily in silicon, that are packed ever more densely so electrical signals travel shorter distances along the circuit. The faster a microchip processes electrical signals, the more efficient a computer becomes. The cost of this higher-powered computing has historically fallen by roughly 30% per year as manufacturing costs decline.
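The roughly 30% annual cost decline mentioned above compounds like any exponential decay. A quick sketch, where the $1,000 starting price is a made-up figure purely for illustration:

```python
# Hypothetical example: compounding a fixed annual percentage price decline.
# The 30% rate comes from the article; the $1,000 starting price is invented.

def cost_after(years, start_cost=1000.0, annual_decline=0.30):
    """Price after compounding a fixed annual percentage decline."""
    return start_cost * (1 - annual_decline) ** years

# After five years at a 30% annual decline, only about 17% of the
# original price remains, since 0.7 ** 5 is roughly 0.168.
print(round(cost_after(5), 2))  # -> 168.07
```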
Practically every facet of a high-tech society benefits from Moore's Law in action. Mobile devices, such as smartphones and computer tablets would not work without tiny processors; neither would video games, spreadsheets, accurate weather forecasts, and global positioning systems (GPS).
Moreover, smaller and faster computers improve transportation, health care, education, and energy production—to name but a few of the industries that have progressed because of the power of computer chips.
Key Takeaways
- Moore's Law asserts that the number of transistors on a microchip doubles about every two years, while the cost of computers is halved.
- In 1965, Gordon E. Moore, the co-founder of Intel, made the observation that became Moore's Law.
- Another tenet of Moore's Law is that the growth of microprocessors is exponential.
[Important: Moore's Law may reach its natural end in the 2020s.]
Is Moore's Law Dead?
Many experts expect computers to reach the physical limits of Moore's Law at some point in the 2020s. The high temperatures of transistors eventually would make it impossible to create smaller circuits, because cooling the transistors would take more energy than the energy already passing through them. In a 2005 interview, Moore himself admitted that his law "can't continue forever. It is the nature of exponential functions," he said. "They eventually hit a wall."
Shrinking transistors have powered advances in computing for more than half a century, but soon engineers and scientists must find other ways to make computers more capable. Instead of physical processes, applications and software may help improve the speed and efficiency of computers. Cloud computing, wireless communication, the Internet of Things, and quantum physics all may play a role in the future of computer tech innovation.
The vision of an endlessly empowered and interconnected future brings both challenges and benefits. Privacy and security threats are growing concerns. In the long run, however, the advantages of ever-smarter computing technology can help keep us healthier, safer, and more productive.
Examples of Moore's Law
You and I
Examples of Moore's Law abound everywhere we turn today. For instance, you likely have needed to purchase a new computer or phone more often than you expected, say every two to four years, because it was too slow, would not run a new application well, or for other reasons. This is a phenomenon of Moore's Law that we all know well.
Perhaps, however, Moore's Law, or its impending death, is most painfully present at the chip manufacturers themselves, as these companies are saddled not only with making our computing chips but with building them at ever-increasing capacity against the physical odds. Even Intel is competing with itself and its industry to create what ultimately may not be possible.
In 2012, with its 22-nanometer (nm) processor, Intel was able to boast having the world's smallest and most advanced transistors in a mass-produced product. In 2014, Intel launched an even smaller, more powerful 14nm chip, and the company has since struggled to bring its 10nm chip to market.
For perspective, one nanometer is one-billionth of a meter, smaller than the wavelength of visible light. The diameter of an atom ranges from about 0.1 to 0.5 nanometers.