What Is Moore's Law?
Moore's Law refers to Gordon Moore's observation that the number of transistors on a microchip doubles about every two years, while the cost of computing falls. In practical terms, Moore's Law states that we can expect the speed and capability of our computers to increase every couple of years, and that we will pay less for them. Another tenet of Moore's Law is that this growth is exponential.
Understanding Moore's Law
In 1965, Gordon E. Moore—co-founder of Intel (NASDAQ: INTC)—postulated that the number of transistors that can be packed into a given unit of space will double about every two years. Today, however, the doubling of installed transistors on silicon chips occurs at a pace faster than every two years.
Gordon Moore did not call his observation "Moore's Law," nor did he set out to create a "law." Moore made that statement based on noticing emerging trends in chip manufacturing at Intel. Eventually, Moore's insight became a prediction, which in turn became the golden rule known as Moore's Law.
From Prediction to Truism
In the decades that followed Gordon Moore's original observation, Moore's Law guided the semiconductor industry in long-term planning and setting targets for research and development (R&D). Moore's Law has been a driving force of technological and social change, productivity, and economic growth that are hallmarks of the late-twentieth and early twenty-first centuries.
Moore's Law implies that computers, machines that run on computers, and computing power all become smaller, faster, and cheaper with time, as transistors on integrated circuits become more efficient.
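The "doubling every two years" claim is just compound growth, and it can be sketched in a few lines of code. The function name and the two-year doubling period below are illustrative assumptions for an idealized curve; the real-world cadence has varied over the decades.

```python
# A minimal sketch of Moore's Law as compound doubling, assuming an
# idealized fixed two-year doubling period (a simplification).

def projected_transistors(initial_count, years, doubling_period=2):
    """Project a transistor count forward under an idealized Moore's Law curve."""
    return initial_count * 2 ** (years / doubling_period)

# Example: a chip with 1 million transistors, projected 20 years out.
# 20 years = 10 doublings = a 1,024x increase.
print(projected_transistors(1_000_000, 20))  # → 1024000000.0
```

Ten doublings in twenty years multiplying a count by roughly a thousand is exactly the exponential character the law describes: modest-sounding periodic doubling compounds into enormous long-run growth.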
Moore's Law in Action: You and I
Maybe you have experienced (as I have) the need to purchase a new computer or phone more often than you wanted to—say, every two to four years—because it was too slow, would not run a new application, or for other reasons. This is a phenomenon of Moore's Law that we all know quite well.
Nearly 60 Years Old; Still Strong
Nearly 60 years later, we still feel the lasting impact and benefits of Moore's Law in many ways.
As transistors in integrated circuits become more efficient, computers become smaller and faster. Chips and transistors are microscopic structures, built primarily from silicon and other semiconductor materials, that are engineered to move electrical signals along the circuit quickly. The faster a microchip processes electrical signals, the more efficient a computer becomes. The cost of higher-powered computers has been dropping annually, partly because of lower labor costs and reduced semiconductor prices.
Practically every facet of a high-tech society benefits from Moore's Law in action. Mobile devices, such as smartphones and tablets, would not work without tiny processors; neither would video games, spreadsheets, accurate weather forecasts, or global positioning systems (GPS).
All Sectors Benefit
Moreover, smaller and faster computers improve transportation, health care, education, and energy production—to name but a few of the industries that have progressed because of the increased power of computer chips.
Key Takeaways
- Moore's Law states that the number of transistors on a microchip doubles about every two years, while the cost of computing falls.
- In 1965, Gordon E. Moore, the co-founder of Intel, made the observation that became Moore's Law.
- Another tenet of Moore's Law says that the growth of microprocessors is exponential.
Moore's Law's Impending End
Experts agree that computers should reach the physical limits of Moore's Law at some point in the 2020s. High transistor temperatures would eventually make it impossible to create smaller circuits, because cooling the transistors would require more energy than the energy already passing through them. In a 2007 interview, Moore himself admitted that "...the fact that materials are made of atoms is the fundamental limitation and it's not that far away...We're pushing up against some fairly fundamental limits so one of these days we're going to have to stop making things smaller."
Connected, Empowered Forever?
The vision of an endlessly empowered and interconnected future brings both challenges and benefits. Shrinking transistors have powered advances in computing for more than half a century, but soon engineers and scientists must find other ways to make computers more capable. Instead of physical processes, applications and software may help improve the speed and efficiency of computers. Cloud computing, wireless communication, the Internet of Things (IoT), and quantum physics all may play a role in the future of computer tech innovation.
Despite the growing concerns around privacy and security, the advantages of ever-smarter computing technology can help keep us healthier, safer, and more productive in the long run.
Creating the Impossible?
Perhaps the idea of Moore's Law approaching its natural death is most painfully present at the chip manufacturers themselves, as these companies are saddled with the task of building ever-more-powerful chips against hard physical limits. Even Intel is competing with itself and its industry to create what ultimately may not be possible.
In 2012, with its 22-nanometer (nm) processor, Intel was able to boast having the world's smallest and most advanced transistors in a mass-produced product. In 2014, Intel launched an even smaller, more powerful 14nm chip, and today the company is struggling to bring its 10nm chip to market.
For perspective, one nanometer is one-billionth of a meter, smaller than the wavelength of visible light. The diameter of an atom ranges from about 0.1 to 0.5 nanometers.