This week CSAIL announced a gift from JPMorgan Chase that will enable important new breakthroughs in artificial intelligence research at MIT’s largest interdepartmental lab.
Every computation faces two main constraints: the amount of memory it requires and how long it takes to run. The two are linked: if a task takes a certain number of steps, the computer can touch at most one memory slot per step, so it can never use more memory slots than it takes steps.
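As a rough illustration of that relationship, here is a hypothetical sketch (the function name and step accounting are invented for this example): a toy routine that counts its steps and the distinct memory slots it touches, showing that the slots used can never exceed the steps taken.

```python
def run_sum(values):
    """Sum a list while tracking steps taken and memory slots touched."""
    steps = 0
    touched = set()          # distinct memory slots accessed so far
    total = 0
    for i, v in enumerate(values):
        steps += 1           # one computational step per element
        touched.add(i)       # each step can touch at most one new slot
        total += v
    return total, steps, len(touched)

total, steps, slots = run_sum([3, 1, 4, 1, 5])
print(total, steps, slots)   # slots never exceeds steps
```

Here each step touches at most one new slot, so however the input grows, the memory footprint is bounded by the running time.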
While early language models could process only text, contemporary large language models perform a wide range of tasks across different types of data. For instance, LLMs can understand many languages, generate computer code, solve math problems, and answer questions about images and audio.