MIT CSAIL unsealed a special time capsule from 1999 after a self-taught programmer from Belgium solved a puzzle devised by MIT professor and famed cryptographer Ron Rivest.
A new algorithm developed by MIT researchers takes cues from panoramic photography to merge massive, diverse cell datasets into a single source that can be used for medical and biological studies.
A novel technique developed by MIT researchers rethinks hardware data compression to free up more memory used by computers and mobile devices, allowing them to run faster and perform more tasks simultaneously.
A new learning system developed by MIT researchers improves robots’ abilities to mold materials into target shapes and make predictions about interacting with solid objects and liquids. The system, known as a learning-based particle simulator, could give industrial robots a more refined touch — and it may have fun applications in personal robotics, such as modeling clay shapes or rolling sticky rice for sushi.
The American Academy of Arts and Sciences announced that MIT professor David Karger was among its new 2019 members. The new class of more than 200 members recognizes the outstanding achievements of individuals in academia, the arts, business, government, and public affairs.
MIT CSAIL researchers have devised a new way to find such patterns using machine learning.
Their system uses a neural network to automatically predict whether a specific element will appear frequently in a data stream. If it will, the item is placed in a separate bucket of so-called “heavy hitters” that receives focused attention; if not, it is handled via hashing.
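The split described above can be sketched as a toy frequency estimator. This is not the researchers’ code: the `predicted_heavy_hitter` function below is a hypothetical stand-in for their neural network, and the hashing path is a standard count-min sketch used here only to illustrate the two-bucket idea.

```python
import hashlib
from collections import defaultdict

def predicted_heavy_hitter(item):
    # Hypothetical stand-in for the learned model: pretend items
    # ending in "0" are predicted to appear frequently.
    return item.endswith("0")

class LearnedFrequencyEstimator:
    def __init__(self, width=64, depth=3):
        self.exact = defaultdict(int)               # separate bucket for predicted heavy hitters
        self.width, self.depth = width, depth
        self.tables = [[0] * width for _ in range(depth)]  # count-min sketch counters

    def _hash(self, item, row):
        # One independent hash per sketch row.
        h = hashlib.sha256(f"{row}:{item}".encode()).digest()
        return int.from_bytes(h[:4], "big") % self.width

    def add(self, item):
        if predicted_heavy_hitter(item):
            self.exact[item] += 1                   # heavy hitters tracked individually
        else:
            for row in range(self.depth):           # everything else handled via hashing
                self.tables[row][self._hash(item, row)] += 1

    def estimate(self, item):
        if item in self.exact:
            return self.exact[item]
        return min(self.tables[row][self._hash(item, row)]
                   for row in range(self.depth))
```

Items routed to the exact bucket are counted without error, while the hashed remainder trades a small overestimate (from collisions) for constant memory, which is the memory-saving trade-off the learned predictor exploits.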
MIT is celebrating the launch of the new $1 billion MIT Stephen A. Schwarzman College of Computing. To help commemorate the event, here’s a list of 25 ways in which MIT has already transformed the world of computing technology.
Today’s data centers consume, and often waste, considerable energy responding to user requests as fast as possible, with only microseconds of delay. A new system by MIT researchers improves the efficiency of high-speed operations by better allocating time-sensitive data processing across central processing unit (CPU) cores and ensuring that hardware runs productively.
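The core-allocation idea can be illustrated with a minimal sketch. This is a toy model, not the MIT system’s actual policy: it assumes a hypothetical scheduler that grants the latency-sensitive application just enough cores to drain its request backlog, returning the rest to batch work so no core sits idle.

```python
def reallocate(cores_total, queue_len, per_core_capacity):
    """Split cores between a latency-sensitive app and batch work.

    queue_len: pending time-sensitive requests
    per_core_capacity: requests one core can drain per scheduling interval
    Returns (latency_cores, batch_cores).
    """
    needed = -(-queue_len // per_core_capacity)   # ceiling division
    latency_cores = max(1, min(cores_total, needed))  # keep at least one core warm
    batch_cores = cores_total - latency_cores         # leftovers do batch work
    return latency_cores, batch_cores
```

For example, with 8 cores and each core draining 10 requests per interval, a backlog of 25 requests yields 3 latency cores and 5 batch cores; as the backlog grows, batch work shrinks toward zero rather than leaving requests waiting.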