WRITTEN BY: Layla Glatman
Research scientist Neil Thompson is the director of the FutureTech Research Project at MIT CSAIL and a principal investigator at MIT’s Initiative on the Digital Economy. His primary research areas include tools and innovation, computer performance, executing on innovation and strategy, and patenting and licensing. Previously, Thompson was an Assistant Professor of Innovation and Strategy and a Visiting Professor at the Laboratory for Innovation Science at Harvard. Before academia, he worked at organizations including Bain and Company, the United Nations, Lawrence Livermore National Laboratory, the Canadian Parliament, and the World Bank. Thompson earned an undergraduate degree in Physics and International Development and a master’s in Economics from the London School of Economics; at Berkeley, he earned Master’s degrees in Computer Science and Statistics and a PhD in Business and Public Policy. In the most recent episode of the CSAIL Alliances podcast series, Thompson joins as a guest to discuss the potential end of Moore’s law.
Computer hardware has improved rapidly and consistently in recent decades, delivering countless benefits to daily life and to many areas of society.
This rapid improvement in computer hardware is captured by Moore’s law, the observation that the number of transistors that can fit on a chip doubles roughly every two years. “The reason that computers have gotten so good, so fast, is that we’ve done a lot of miniaturizing of the components of them,” says Thompson. “When you miniaturize these transistors, you can fit more on each chip and they produce less heat, which is important because it allows chips to run faster and faster.”
But what happens when these improvements begin slowing down? That slowdown is what is known as the end of Moore’s law, which Thompson describes as the end of the miniaturization trend and of the gains it has made possible. “We are coming to this point where as wires get down to just atoms in diameter, things get more complicated,” Thompson explains. “And so, it gets much harder to build chips at that point, and these things that were allowing us to make enormous amounts of progress are slowing.”
While miniaturization allows chips to run faster, it is not the only lever for improving performance. Progress can also come from better chip circuit design and from making operating systems and applications more efficient. Thompson offers the example of Python, which is much easier for programmers to use than C or Fortran but far less efficient for the computer to run. “I think we’re likely to take back some of that,” Thompson shares. “We’re going to look at codes really carefully and find all the different ways that we can make it more efficient and speed it up.”
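To make that software-efficiency gap concrete, here is a minimal sketch (not from the episode; the workload and the use of NumPy are illustrative assumptions): the same computation written as an easy, interpreted Python loop and as a call into compiled, vectorized routines.

```python
# Illustrative sketch: timing the same computation two ways. The compiled,
# vectorized version typically runs orders of magnitude faster than the plain
# interpreted loop, the kind of performance software work can "take back."
import time

import numpy as np

N = 10_000_000

# Easy to write: every multiply and add goes through the Python interpreter.
start = time.perf_counter()
total = 0.0
for i in range(N):
    total += i * i
loop_seconds = time.perf_counter() - start

# The same arithmetic pushed down into compiled code via NumPy.
start = time.perf_counter()
values = np.arange(N, dtype=np.float64)
total_vectorized = float(np.dot(values, values))
numpy_seconds = time.perf_counter() - start

print(f"Interpreted loop: {loop_seconds:.2f} s")
print(f"Vectorized NumPy: {numpy_seconds:.3f} s")
print(f"Results agree: {abs(total - total_vectorized) < 1e-3 * total}")
```

The speedup comes from removing interpreter overhead and running tight, compiled code over the data, which is the sort of optimization Thompson expects to matter more as hardware gains slow.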
Another important concept in innovation economics is the general purpose technology, or GPT. GPTs are technologies, such as computers, that can be applied across many areas; as computers got faster, for example, it became possible to use them for video calls like Zoom. From an economic standpoint, a GPT is a technology whose innovations “spill over” to benefit adjacent areas. Computers, and the way they have improved over many decades, are the classic example of a GPT.
“There’s a self-reinforcing cycle,” shares Thompson. “As computers improve, people buy more of them, and that additional market size finances the following rounds of improvement for the chips.” Today that cycle depends on sustained investment delivering consistent improvements in processor technology, and as it slows, society may get less innovation and generate less prosperity.
Thompson also expects other changes as we near the end of Moore’s law, such as a shift in the role of CTOs. “I think it’s going to become a more technically rigorous job, in the sense that it’s going to require understanding more details about potential custom algorithms that are being used and the hardware that will underlie them,” says Thompson.
While most computers today use similar general-purpose CPUs, Thompson believes that specialized chips designed for particular workloads, such as deep learning and other machine learning tasks, will steadily gain ground as people can no longer rely on general improvements in computer hardware.
As computing progress slows, fewer advances and benefits could follow. “One of the things about computing over the last decades is that it’s been a rising tide that lifts all boats and we are really not going to be in that world anymore,” says Thompson.
That’s why Thompson and his fellow CSAIL researchers are investigating what will drive computer performance after Moore’s law.