[Image: A software program runs on a monitor at an empty desk (Credit: Pixabay)]
CSAIL article

One set of probabilistic inference algorithms common in robotics is Sequential Monte Carlo methods, also known as “particle filtering,” which approximate a target probability distribution through repeated random sampling. (A “particle,” in this context, is an individual sample.) Traditional particle filtering struggles to provide accurate results on complex distributions, which has given rise to more advanced algorithms such as hybrid particle filtering.
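To make the idea concrete, here is a minimal sketch of a bootstrap particle filter for a one-dimensional random-walk model with Gaussian observation noise. The model, parameters, and function name are illustrative assumptions, not the implementation of any specific system discussed in the article.

```python
import numpy as np

def particle_filter(observations, num_particles=1000,
                    process_std=1.0, obs_std=1.0, rng=None):
    """Bootstrap particle filter for an assumed 1-D random-walk model."""
    rng = np.random.default_rng() if rng is None else rng

    # Each "particle" is one random sample of the hidden state.
    particles = rng.normal(0.0, 1.0, size=num_particles)
    weights = np.full(num_particles, 1.0 / num_particles)
    estimates = []

    for y in observations:
        # 1. Propagate: push every particle through the motion model.
        particles = particles + rng.normal(0.0, process_std, size=num_particles)

        # 2. Weight: score each particle by how well it explains the observation.
        likelihood = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights = likelihood + 1e-300        # guard against all-zero weights
        weights /= weights.sum()

        # 3. Estimate: the weighted mean approximates the posterior mean.
        estimates.append(np.sum(weights * particles))

        # 4. Resample: duplicate high-weight particles, drop low-weight ones.
        idx = rng.choice(num_particles, size=num_particles, p=weights)
        particles = particles[idx]
        weights = np.full(num_particles, 1.0 / num_particles)

    return np.array(estimates)

# Usage example: track a noisy random walk.
rng = np.random.default_rng(0)
true_states = np.cumsum(rng.normal(0.0, 1.0, size=50))
obs = true_states + rng.normal(0.0, 1.0, size=50)
print(particle_filter(obs, rng=rng)[:5])
```

With more particles the weighted samples approximate the posterior distribution more closely, which is the "repeated random sampling" the paragraph above refers to.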

[Image: The new compiler, called SySTeC, can optimize computations by automatically taking advantage of both sparsity and symmetry in tensors (Credits: iStock)]
CSAIL article

The neural network models behind artificial intelligence applications such as medical image processing and speech recognition perform operations on hugely complex data structures that require an enormous amount of computation to process. This is one reason deep-learning models consume so much energy.
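As a rough illustration of the two structural properties named in the caption (a hypothetical sketch of the general ideas, not of how SySTeC itself transforms code): exploiting sparsity means touching only the stored nonzero entries, and exploiting symmetry means computing only one triangle of a symmetric result and mirroring it.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Sparsity: multiply a mostly-zero matrix by a vector, visiting only nonzeros.
n = 1000
dense = rng.standard_normal((n, n)) * (rng.random((n, n)) < 0.01)  # ~1% nonzero
rows, cols = np.nonzero(dense)          # coordinate (COO) representation
vals = dense[rows, cols]
x = rng.standard_normal(n)

y = np.zeros(n)
np.add.at(y, rows, vals * x[cols])      # work scales with nonzeros, not n*n
assert np.allclose(y, dense @ x)

# --- Symmetry: for G = B @ B.T, G[i, j] == G[j, i], so computing the upper
# triangle is enough; the lower triangle is filled in by mirroring.
m, k = 200, 30
B = rng.standard_normal((m, k))
G = np.zeros((m, m))
for i in range(m):
    for j in range(i, m):               # only j >= i: roughly half the work
        G[i, j] = B[i] @ B[j]
G = G + np.triu(G, 1).T                 # mirror the strict upper triangle
assert np.allclose(G, B @ B.T)
```

Skipping zeros and avoiding redundant symmetric entries both cut the amount of arithmetic, which is the kind of saving the compiler described above automates.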