Researchers from the Singapore-MIT Alliance for Research and Technology’s (SMART) Mens, Manus & Machina (M3S) interdisciplinary research group and the National University of Singapore (NUS), alongside collaborators from the Massachusetts Institute of Technology (MIT) and Nanyang Technological University (NTU Singapore), have developed an AI control system that enables soft robotic arms to learn a wide repertoire of motions and tasks once, then adjust to new scenarios on the fly without retraining or any loss of functionality. The breakthrough brings soft robotics closer to human-like adaptability for real-world applications — including assistive robotics, rehabilitation robots, and wearable or medical soft robots — by making them more intelligent, versatile, and safe.
Generative artificial intelligence models have left such an indelible mark on digital content creation that it’s getting harder to recall what the internet was like before them. You can call on these AI tools for creative projects such as videos and photos — but their flair for the creative hasn’t quite crossed over into the physical world just yet.
MIT researchers have developed a new method for designing 3D structures that can be transformed from a flat configuration into their curved, fully formed shape with only a single pull of a string.
Computer-aided design (CAD) systems are tried-and-true tools used to design many of the physical objects we use each day. But CAD software requires extensive expertise to master, and many tools incorporate such a high level of detail that they don’t lend themselves to brainstorming or rapid prototyping.
Imagine a continuum soft robotic arm bending around a bunch of grapes or a head of broccoli, adjusting its grip in real time as it lifts the object. Unlike traditional rigid robots, which generally avoid contact with their environment as much as possible and stay far away from humans for safety reasons, this arm senses subtle forces, stretching and flexing in ways that mimic the compliance of a human hand. Its every motion is calculated to avoid excessive force while achieving the task efficiently. In the labs of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Laboratory for Information and Decision Systems (LIDS), these seemingly simple movements are the culmination of complex mathematics, careful engineering, and a vision for robots that can safely interact with humans and delicate objects.
Pulkit Agrawal, MIT EECS Associate Professor and CSAIL principal investigator, has received the Toshio Fukuda Young Professional Award from the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) for his work in “robot learning, self-supervised and sim-to-real policy learning, agile locomotion, and dexterous manipulation,” according to the organization.
Chatbots like ChatGPT and Claude have experienced a meteoric rise in usage over the past three years because they can help with a wide range of tasks. Whether you’re writing Shakespearean sonnets, debugging code, or hunting for the answer to an obscure trivia question, artificial intelligence (AI) systems seem to have you covered. The source of this versatility? Billions or even trillions of textual data points across the internet.
Eight scientists and engineers from around the world, working in a variety of disciplines, have been named Schmidt Polymaths and will each receive up to $2.5 million over five years to pursue research in new disciplines or with new methodologies, Schmidt Sciences announced today.
The artificial intelligence models that turn text into images are also useful for generating new materials. Over the last few years, generative materials models from companies like Google, Microsoft, and Meta have drawn on their training data to help researchers design tens of millions of new materials.