Diffusion models like OpenAI's DALL-E are becoming increasingly useful in helping brainstorm new designs. Humans can prompt these systems to generate an image, create a video, or refine a blueprint, and come away with ideas they hadn't considered before.
A human clearing junk out of an attic can often guess the contents of a box simply by picking it up and giving it a shake, without the need to see what’s inside. Researchers from MIT, Amazon Robotics, and the University of British Columbia have taught robots to do something similar.
When the Venice Biennale’s 19th International Architecture Exhibition launches on May 10, its guiding theme will be applying nimble, flexible intelligence to a demanding world — an ongoing focus of its curator, MIT faculty member Carlo Ratti.
Fish are masters of coordinated motion. Schools of fish have no leader, yet individuals manage to stay in formation, avoid collisions, and respond with liquid flexibility to changes in their environment. Reproducing this combination of robustness and flexibility has been a long-standing challenge for human-engineered systems like robots. Now, using virtual reality for freely moving fish, a research team based in Konstanz has taken an important step toward that goal.
An estimated 20 percent of manufacturing spending is wasted, totaling up to $8 trillion a year — more than the entire annual budget of the U.S. federal government. While industries like healthcare and finance have been rapidly transformed by digital technologies, manufacturing still relies on traditional processes that lead to costly errors, product delays, and inefficient use of engineers' time.
This week the National Academy of Engineering (NAE) elected Tomás Lozano-Pérez, MIT School of Engineering Professor in Teaching Excellence and CSAIL principal investigator, as a member for his work in robot motion planning and molecular design.