Specialized Programming Languages and Compilers with Professor Saman Amarasinghe

Written By: Audrey Woods

The end of Moore’s Law has significant implications for computer scientists, who have long depended on the steady growth in computing power of the past 50 years. In 1965, Gordon Moore observed that the number of transistors in an integrated circuit doubles about every two years, making computers faster, smaller, and more efficient on a reasonably predictable schedule. But there is now general acknowledgement that we’ve reached the physical limits of silicon-based CPUs, which means that programmers and engineers can no longer simply wait for performance problems to be solved by the next generation of hardware. Instead, researchers must find new ways to maximize the efficiency and speed of current computer systems.

MIT Professor Saman Amarasinghe is one of the scientists working on this problem, with a focus on using specialized programming languages and compilers to maximize performance on modern computing platforms. In his conversation with CSAIL Alliances, Professor Amarasinghe explains how the “Golden Age of Moore’s Law” made programmers lax when it came to program design. He says that developers “became inefficient because there was no need to be efficient, no need to be concise.” However, now that Moore’s Law has ended, “we are hitting a point where there's not much fat left… And then machine learning suddenly said, ‘we need a lot more performance,’ performance that right now we can't provide.” This bottleneck has led to a “renaissance” of performance engineering, where scientists are combining old techniques and thinking with new computing systems to squeeze out as much performance as possible.

As leader of the Commit compiler group at CSAIL, Professor Amarasinghe is working to sustain the pace of innovation we’ve enjoyed for the past half century and ensure that the end of Moore’s Law doesn’t mean the end of computing progress as we know it.

A Solution: Domain-Specific Languages

A large portion of Professor Amarasinghe’s research focuses on the creation of domain-specific languages (DSLs), programming languages designed to maximize productivity in a particular domain. He explains, “if you look at something like C, it's a general-purpose language. But if you look at how people program, there are domains like graphics or computational biology or earth simulation. [Programmers] are doing some very specific tasks in those fields, and they also know certain things you can do to get good performance. So what DSLs try to do is capture those tasks natively and do those things to run tasks faster.”

One example he brings up is Halide, a now widely adopted language and compiler developed at CSAIL to help graphics programmers. Created by Professor Amarasinghe, Professor Frédo Durand, and Assistant Professor Jonathan Ragan-Kelley, Halide was designed to accelerate common graphics tasks by exploiting their repeating patterns and characteristics. Before Halide, a programmer might spend weeks or even months writing and optimizing a new feature for, say, Photoshop. Using Halide, the same task can be done in hours.
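To give a sense of what that looks like in practice, below is a minimal sketch of a Halide pipeline in C++, loosely modeled on the separable 3x3 blur used throughout Halide’s public tutorials. The algorithm (what to compute) is written once; the schedule (how to run it) is layered on separately. The particular tile sizes, vector widths, and the fast_blur name are illustrative assumptions, not recommendations from Professor Amarasinghe or the Halide project.

```cpp
// A minimal sketch of a Halide pipeline, assuming the Halide library is
// installed. Based on the separable 3x3 blur from Halide's public
// tutorials; the schedule below is illustrative, not tuned.
#include "Halide.h"
using namespace Halide;

int main() {
    // A two-dimensional 16-bit input image (e.g., one color channel).
    ImageParam input(UInt(16), 2);

    Var x("x"), y("y"), xi("xi"), yi("yi");
    Func blur_x("blur_x"), blur_y("blur_y");

    // The algorithm: what to compute, written once, independent of how
    // it will be executed.
    blur_x(x, y) = (input(x - 1, y) + input(x, y) + input(x + 1, y)) / 3;
    blur_y(x, y) = (blur_x(x, y - 1) + blur_x(x, y) + blur_x(x, y + 1)) / 3;

    // The schedule: how to execute it. Tile the output, vectorize within
    // tiles, process rows of tiles in parallel, and compute blur_x per
    // tile so intermediate results stay in cache.
    blur_y.tile(x, y, xi, yi, 256, 32)
          .vectorize(xi, 8)
          .parallel(y);
    blur_x.compute_at(blur_y, x)
          .vectorize(x, 8);

    // Ahead-of-time compile the pipeline to a static library plus header
    // that a larger application (a photo editor, say) could link against.
    blur_y.compile_to_static_library("fast_blur", {input}, "fast_blur");
    return 0;
}
```

Because changing the schedule never changes what the pipeline computes, a developer can experiment with many optimization strategies quickly instead of restructuring hand-written loops for each attempt, which is the kind of productivity gain the Photoshop example above describes.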

But DSLs aren’t a cure-all, Professor Amarasinghe says, explaining how “we need to capture what's important for programmers because [DSLs] can't do everything. If you put too many things, it becomes harder for compilers to work because it becomes too complicated. But if it doesn't do the important things for the programmer, they can't use it either. So we need to find the minimal set of rules that we can incorporate for programmers to use, and then that will become a useful domain specific language.”

Another challenge for DSLs is the lack of supporting tools. Without the size and ubiquity of larger languages like Python and C++, developers must tackle issues like debugging on their own, something Professor Amarasinghe has previously called “the Achilles heel for DSLs.” However, recent work from his group offers one potential solution in DX2, a tool for adding debugging capabilities to any DSL. This new tool could have broad application, even supporting popular DSLs like Halide, which, despite its widespread use, still doesn’t have a debugger.

More broadly, this research aims to change the landscape of programming, bringing more options to software developers and even non-experts. Professor Amarasinghe says that DSLs like Halide make it possible for more people to achieve top-level performance without years of experience and effort. He also notes that adding large language models to this process could make things more interesting still. Professor Amarasinghe imagines a future where someone could write a description of a program and have a tool produce an efficient, compiled version of that program. He says the technology is “not there yet, but there are early indications that direction might be possible.”

The Importance of Efficiency

When asked what problem he believes isn’t getting enough attention right now, Professor Amarasinghe answers that, while there is plenty of focus on reducing computing costs and maximizing performance with available computation, there isn’t enough acknowledgement of the gains that could be made on small devices and everyday applications. He says, “thinking about global warming, your iPhone doesn't seem like much, but when there are a billion of these devices, it adds up.”

Right now, the bulk of economic attention goes to maximizing the efficiency of large workloads like machine learning or lowering the costs associated with services like AWS. This means that to get traction, Professor Amarasinghe and his group are targeting their research in that direction. But he hopes for broader awareness of, and interest in, reducing the energy footprint of everyday devices, saying, “if a billion people reduce the energy consumption of their devices by 50%, that's going to have a huge impact in the world.”

The Future of Programming

These new advances raise the interesting question of what it will mean to be a programmer going forward. Professor Amarasinghe acknowledges that the people working with computers “will still have to understand how to instruct the computer. They can't be just anybody. So there will be a certain level of knowledge, understanding, and education to do that, but they might be educated at a much different level.”

Professor Amarasinghe sees this change as exciting, furthering the “democratization” of computing he has already witnessed elsewhere, such as in his role as faculty director of MIT’s Global Startup Labs. With the advent of smartphones and personal devices, he explains, the opportunities to try out a business idea or startup have exploded, allowing people to do “amazing things with very little money.” This has facilitated widespread innovation and given entrepreneurs around the world the chance to try their hand at monetizing apps, programs, and tools, perhaps balancing out whatever negative impact the end of Moore’s Law might have. Hopefully, Professor Amarasinghe’s work on DSLs, compilers, and other high-performance computing tools will continue to support such innovation and fuel the future of computer technology.