During a meeting of class 6.C40/24.C40 (Ethics of Computing), Professor Armando Solar-Lezama poses the same impossible question to his students that he often asks himself in the research he leads with the Computer Assisted Programming Group at MIT:
Imagine you’re a chef with a highly sought-after recipe. You write your top-secret instructions in a journal to ensure you remember them, but the recipe’s location is evident from the folds and tears on the edges of that often-referenced page.
This past month, Martin Rinard, MIT professor in the Department of Electrical Engineering and Computer Science (EECS) and CSAIL principal investigator, received the 2025 Outstanding Research Award from the ACM Special Interest Group on Software Engineering (SIGSOFT). The organization honored him for his “fundamental contributions in pioneering the new fields of program repair and approximate computing.”
Sara Beery came to MIT as an assistant professor in the Department of Electrical Engineering and Computer Science (EECS) eager to focus on ecological challenges. She has fashioned her research career around the opportunity to apply her expertise in computer vision, machine learning, and data science to tackle real-world issues in conservation and sustainability. Beery was drawn to the Institute’s commitment to “computing for the planet,” and set out to bring her methods to global-scale environmental and biodiversity monitoring.
Should you grab your umbrella before you walk out the door? Checking the weather forecast beforehand will only be helpful if that forecast is accurate.
A particular set of probabilistic inference algorithms common in robotics involves Sequential Monte Carlo methods, also known as “particle filtering,” which approximate complex probability distributions using repeated random sampling. (“Particle,” in this context, refers to an individual sample.) Traditional particle filtering struggles to provide accurate results on complex distributions, giving rise to advanced algorithms such as hybrid particle filtering.
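To make the idea concrete, here is a minimal sketch of a classic bootstrap particle filter (not the hybrid variant mentioned above) for a simple one-dimensional random-walk model; the model, parameter names, and noise levels are illustrative assumptions, not taken from any particular robotics system.

```python
import numpy as np

def particle_filter(observations, n_particles=1000,
                    process_std=1.0, obs_std=1.0, seed=0):
    """Bootstrap particle filter for an assumed 1-D random-walk model.

    State:       x_t = x_{t-1} + Normal(0, process_std^2)
    Observation: y_t = x_t     + Normal(0, obs_std^2)

    Returns the filtered (weighted-mean) estimate of x_t at each step.
    """
    rng = np.random.default_rng(seed)
    # Each "particle" is one random sample of the hidden state.
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in observations:
        # 1. Propagate: push every particle through the process model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # 2. Weight: score each particle by the observation likelihood.
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # 3. Estimate: weighted mean of the particles approximates E[x_t | y_1..t].
        estimates.append(float(np.sum(weights * particles)))
        # 4. Resample: concentrate particles in high-likelihood regions.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return estimates
```

The propagate-weight-resample loop is the core of all Sequential Monte Carlo methods; advanced variants mainly change how particles are proposed and reweighted so that fewer samples are wasted on unlikely states.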
The neural network artificial intelligence models used in applications like medical image processing and speech recognition perform operations on hugely complex data structures that require an enormous amount of computation to process. This is one reason deep-learning models consume so much energy.
From crafting complex code to revolutionizing the hiring process, generative artificial intelligence is reshaping industries faster than ever before — pushing the boundaries of creativity, productivity, and collaboration across countless domains.
If you’ve watched cartoons like Tom and Jerry, you’ll recognize a common theme: An elusive target evades his formidable adversary. This game of “cat and mouse” — whether literal or otherwise — involves pursuing something that ever-so-narrowly escapes you at every attempt.