Recently a team led by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has been exploring whether self-driving cars can be programmed to classify the social personalities of other drivers, so that they can better predict what different cars will do — and, therefore, be able to drive more safely among them.
A few years ago, the idea of tricking a computer vision system by subtly altering pixels in an image or hacking a street sign seemed like more of a hypothetical threat than anything to seriously worry about.
From 3-D printing to 3-D knitting, CSAIL researchers are working to streamline design technology through AI. Their computer-aided design tool lets users customize patterns based on their preferences, with minimal programming knowledge required.
Robots that have been programmed to see or feel can’t use these signals quite as interchangeably. To better bridge this sensory gap, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a predictive artificial intelligence (AI) that can learn to see by touching, and learn to feel by seeing.
Wearing a sensor-packed glove while handling a variety of objects, MIT CSAIL researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.
Technology as a vector for positive change
CSAIL recently established the TEDxMIT series, whose events will feature talks on impactful ideas from members of the broader MIT community.
This event is organized by Daniela Rus and John Werner, in collaboration with a team of undergraduate students led by Stephanie Fu and Rucha Keklar.
MIT CSAIL unsealed a special time capsule from 1999 after a self-taught programmer from Belgium solved a puzzle devised by MIT professor and famed cryptographer Ron Rivest.