A new paper by MIT CSAIL researchers maps the many software-engineering tasks beyond code generation, identifies bottlenecks, and highlights research directions to overcome them. The goal: to let humans focus on high-level design, while routine work is automated (Credits: Alex Shipps/MIT CSAIL, using assets from Shutterstock and Pixabay).

Imagine a future where artificial intelligence quietly shoulders the drudgery of software development: refactoring tangled code, migrating legacy systems, and hunting down race conditions, so that human engineers can devote themselves to architecture, design, and the genuinely novel problems still beyond a machine’s reach. Recent advances appear to have nudged that future tantalizingly close, but a new paper by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and several collaborating institutions argues that realizing it demands a hard look at present-day challenges.

The “PhysicsGen” system can multiply a few dozen VR demonstrations into nearly 3,000 simulations per machine for mechanical companions like robotic arms and hands (Credit: Alex Shipps/MIT CSAIL using photos from the researchers).

When ChatGPT or Gemini gives what seems to be an expert response to your burning questions, you may not realize how much information it relies on to give that reply. Like other popular artificial intelligence (AI) models, these chatbots rely on backbone systems called foundation models that train on billions or even trillions of data points.

Ray and Maria Stata Center exterior

MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has announced a new direction for its long-standing FinTech research initiative, now FinTechAI@CSAIL, to highlight the central role artificial intelligence is playing in shaping the future of finance.

Top row, left to right: Matthew Caren, April Qiu Cheng, Arav Karighattam, and Benjamin Lou. Bottom row, left to right: Isabelle Quaye, Albert Qin, Ananthan Sadagopan, and Gianfranco (Franco) Yee (Credits: Photos courtesy of the Hertz Foundation).

The Hertz Foundation announced that it has awarded fellowships to eight MIT affiliates. The prestigious award provides each recipient with five years of doctoral-level research funding (up to a total of $250,000), giving them unusual independence to pursue groundbreaking research in their graduate work.

SketchAgent uses a multimodal language model to turn natural language prompts into sketches in a few seconds. It can doodle on its own or through collaboration, drawing with a human or incorporating text-based input to sketch each part separately (Credits: Alex Shipps/MIT CSAIL, with AI-generated sketches from the researchers).

When you’re trying to communicate or understand ideas, words don’t always do the trick. Sometimes the more efficient approach is to do a simple sketch of that concept — for example, diagramming a circuit might help make sense of how the system works.

But what if artificial intelligence could help us explore these visualizations? While these systems are typically proficient at creating realistic paintings and cartoonish drawings, many models fail to capture the essence of sketching: its stroke-by-stroke, iterative process, which helps humans brainstorm and edit how they want to represent their ideas.

PhD student Faraz Faruqi, lead author of a new paper on the project, says that TactStyle could have far-reaching applications extending from home decor and personal accessories to tactile learning tools (Credits: Mike Grimmett/MIT CSAIL).

Essential to industries ranging from Hollywood computer-generated imagery to product design, 3D modeling tools often use text or image prompts to dictate different aspects of visual appearance, like color and form. While these are natural starting points, such systems still fall short of realism because they neglect something central to the human experience: touch.