Swarms of simple, interacting robots have the potential to unlock stealthy abilities for accomplishing complex tasks. Getting these robots to achieve a true hive-like mind of coordination, though, has proved to be a hurdle.
A team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has brought us closer to this chameleon reality, by way of a new system that uses reprogrammable ink to let objects change colors when exposed to ultraviolet (UV) and visible light sources.
From 3-D printing to 3-D knitting, CSAIL researchers are working to streamline design technology through AI. Their computer-aided design tool lets users customize patterns to their preferences with minimal programming knowledge.
Robots that have been programmed to see or feel can't use these signals as interchangeably as humans can. To better bridge this sensory gap, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a predictive artificial intelligence (AI) that can learn to see by touching, and learn to feel by seeing.
Wearing a sensor-packed glove while handling a variety of objects, MIT CSAIL researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.
Technology as a vector for positive change
CSAIL recently established the TEDxMIT series, whose events will feature talks about important and impactful ideas from members of the broader MIT community.
This event is organized by Daniela Rus and John Werner, in collaboration with a team of undergraduate students led by Stephanie Fu and Rucha Kelkar.
MIT CSAIL unsealed a special time capsule from 1999 after a self-taught programmer from Belgium solved a puzzle devised by MIT professor and famed cryptographer Ron Rivest.