Written by Audrey Woods
With the recent excitement around emerging and buzzworthy technologies such as generative AI, it can feel like we live in an age of unprecedented advancement. Aided by social media and our interconnected world, the computer science news cycle moves fast, which can give average consumers the impression that our digital tools are improving just as rapidly.
However, the counterintuitive truth is that the pace of computer progress is actually slowing down. With the end of Moore’s Law and the explosion in chip production costs, computer science as a field has a reckoning ahead of it. Programmers can no longer assume that reliably faster chips will deliver the improvements needed for larger and more complicated tasks. They will have to work harder to innovate, seeking out new ideas and methods to maintain the momentum of progress we’ve enjoyed for the past 50 years.
Working to discover some of those new methods, Research Scientist Neil Thompson is the Director of the MIT FutureTech Research Project and co-leads the Supertech Research Group at CSAIL along with Professor Charles Leiserson and Research Scientist Tao Schardl. With a PhD in business and public policy and a master’s in computer science and statistics from UC Berkeley, along with a master’s in economics from the London School of Economics, Dr. Thompson brings both a technical and commercial background to the question of what computer improvement will look like going forward. He has advised businesses and governments on the future of Moore’s Law and machine learning, has served on National Academies panels on transformational technologies and scientific reliability, and has previously lectured at the MIT Sloan School of Management and Harvard University.
With extensive experience in research, teaching, and economic analysis, Dr. Thompson provides some insight into what this changing rate of computer progress will mean, where new methods of improvement might be found, and what everyday users can expect.
Will Computers Still Be a General Purpose Technology?
It’s easy to understand why computers are considered a “General Purpose Technology,” or what economists refer to as a GPT. Indeed, computers affect nearly every area of modern life, from finances to medicine to entertainment. The economic concept of a GPT is important because, as Dr. Thompson explained in a previous conversation with CSAIL Alliances, “when innovation happens in that technology, it spills over into all the adjacent areas. As we get better algorithms for compressing data, Netflix gets better. Or as we improve what computer vision can do, our cars can keep on the road a bit better. All of those things are examples of this innovation in one place spilling out into all of these other areas.”
This matters not just because of the ripple effects of computer progress, such as those Moore’s Law has enabled for the past half-century, but also because of the self-reinforcing nature of this cycle. Improvements are expensive but, thanks to the technology’s ubiquity, there’s money to go around. Dr. Thompson explains, “That additional market size finances the next round of improvement.” This positive feedback loop has been enormously beneficial to society, fueling an era of incredible development and growth. By some estimates, a third of all productivity increases in the US since 1974 have come from information technology, making computers one of the largest contributors to national prosperity.
The bad news is that Dr. Thompson’s research has revealed this cycle is slowing down. As CPU enhancement grows more expensive, economic incentives will begin to push companies and users toward specialized systems that deliver gains within their own niches but not to the broader computer science field. While this might not seem critical, the findings have broad implications. For one, it means that the computer progress of the future will bring more targeted benefits to specific companies and industries rather than the “rising tide” of before. The role of the CTO will also become more technically rigorous as corporations are driven to make challenging hardware and software decisions to squeeze out as much efficiency as possible. And finally, because software portability can no longer be taken for granted in a world of specialized hardware, programmers will have to get creative about where improvement comes from.
Where can those programmers look for gains in a post-Moore’s Law era? Working with several other CSAIL scholars, Dr. Thompson published a 2020 Science paper titled “There’s plenty of room at the Top” on exactly this topic. Acknowledging that silicon-based transistors can’t get much smaller, the authors argued that there are many options left to explore at the “top” of the computing stack, namely software, algorithms, and hardware architecture. For example, programmers have historically made decisions that prioritized ease of writing code over processing speed, trusting that faster hardware would make up for less efficient programs. But switching from a language like Python to a more efficient one like C can, in some cases, make the same computation run roughly 47 times faster, as the paper demonstrated with matrix multiplication. In other words, there are still ample improvements to be found in rethinking traditional wisdom and applying software performance engineering.
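To make that comparison concrete, here is a minimal sketch of the kind of kernel behind such measurements: a naive triple-loop matrix multiply written in C. The matrix size, input values, and timing harness below are illustrative assumptions, not the paper’s exact benchmark setup; the point is that this loop nest, translated line for line into interpreted Python, runs orders of magnitude slower.

```c
/* A minimal sketch of a naive matrix-multiply kernel, the kind of code
 * used to compare interpreted Python against compiled C. The size N and
 * the input values are illustrative assumptions, not the Science paper's
 * exact benchmark, and this sketch does not reproduce the ~47x figure
 * itself; it only shows the loop nest being compared. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1024  /* illustrative matrix dimension */

int main(void) {
    double *a = malloc(sizeof(double) * N * N);
    double *b = malloc(sizeof(double) * N * N);
    double *c = calloc((size_t)N * N, sizeof(double));
    if (!a || !b || !c) return 1;

    /* Fill the inputs with arbitrary deterministic values. */
    for (long i = 0; i < (long)N * N; i++) {
        a[i] = (double)(i % 7);
        b[i] = (double)(i % 13);
    }

    clock_t start = clock();
    /* The textbook i-j-k loop nest: no blocking, no vectorization,
     * no parallelism. A line-for-line Python version of these three
     * loops is what runs dramatically slower under an interpreter. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            for (int k = 0; k < N; k++)
                c[i * N + j] += a[i * N + k] * b[k * N + j];
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    printf("Naive %dx%d multiply in C: %.2f s (c[0]=%f)\n",
           N, N, secs, c[0]);
    free(a); free(b); free(c);
    return 0;
}
```

And the language switch is only the first rung: the same paper shows that parallelizing the loops, exploiting vector hardware, and restructuring the computation for cache locality compound the gains far beyond the interpreter-versus-compiler gap.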
Next Steps: CHIPS Act & Beyond
One topic Dr. Thompson is actively working to raise awareness about is how this slowdown of computer progress threatens the global advantage the US has enjoyed since computers first came onto the scene. “A huge proportion of the algorithms that have pushed computing forward have come out in the United States,” Dr. Thompson says. “That overflows into all these other areas of society and gives them benefits. We’re actually really losing that lead.” A recent report put out by Dr. Thompson’s group shows that China has already closed the computing gap with the US in many areas, and nearly 80% of American leaders in the computer science field believe that Chinese tech giants are improving their capabilities faster than US companies are. In its number of supercomputers, its work on advanced algorithms, and the number of computer science PhDs it graduates, China is at or near parity with the US and is projected to surpass it in the coming years.
To address this, Dr. Thompson recommends active investment, particularly by the US government, in “trying to figure out what the next era of computing looks like.” He calls the CHIPS Act of 2022, which allocated $52.7 billion for American semiconductor research, development, manufacturing, and workforce development, “good first steps.” But Dr. Thompson, along with the other authors of the report, suggests other avenues to consider going forward. For example, democratizing access to technologies like supercomputers could spur more innovation and economic experimentation, and expanding education programs could grow the reservoir of computer science talent and surface new and groundbreaking ideas. Finally, while many great things can come from the business sector, Dr. Thompson and his team showed in a recent publication that industry is rapidly gaining control over key areas such as AI, which risks the technology developing in ways that do not serve the public interest or prioritize national competitiveness. The authors write, “the goal should not be that academia does a particular share of research. Instead, the goal should be to ensure the presence of sufficient capabilities to help audit or monitor industry models or to produce alternative models designed with the public interest in mind.”
Ultimately, there’s no way to know what will define the next generation of computers. While many novel ideas are being trialed, such as quantum computing, spintronics, neuromorphic computing, optical computing, and optical interconnect fabrics, it’s difficult to predict which, if any, of these technologies could replace the economic and technical engine that was Moore’s Law. But Dr. Thompson’s group has a recommendation for that as well in an article titled “Unleash the Unexpected.” Pointing out that the accelerometer chip, now a critical component of smart devices everywhere, was invented long before its true impact became apparent, Dr. Thompson and his colleagues argue that breakthroughs aren’t predictable. Transformational ideas can come from unforeseen places and might not be immediately recognized. This means that time, investment, brainpower, and patience are needed to fully explore the range of options that could fuel the digital economy going forward.
With his focus on finding, studying, and highlighting such options, Dr. Thompson is doing his part to expand the horizon of improved computer performance.
Learn more about Dr. Thompson on his website or CSAIL page.