Sakana, a Tokyo-based AI startup, has introduced a new AI model architecture called Continuous Thought Machines (CTMs), designed to function more like human brains. Unlike traditional Transformer models, CTMs unfold computation over internal time steps, allowing each neuron to make activation decisions based on a short-term memory of its own past activity.
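The mechanism described above can be illustrated with a minimal sketch. This is not Sakana's implementation; the dimensions, weights, and update rule are all illustrative assumptions. It shows the core idea: computation unfolds over internal "thought" ticks, and each neuron computes its next activation from a short rolling history of its own pre-activations rather than from a single instantaneous value.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 8   # number of neurons (illustrative)
HISTORY = 4     # per-neuron short-term memory length (assumption)
TICKS = 10      # internal time steps to unfold

# Per-neuron weights over that neuron's own history (hypothetical).
w_hist = rng.normal(size=(N_NEURONS, HISTORY))
# Shared mixing weights producing each tick's pre-activations.
w_mix = rng.normal(size=(N_NEURONS, N_NEURONS)) / np.sqrt(N_NEURONS)

history = np.zeros((N_NEURONS, HISTORY))  # rolling pre-activation buffer
x = rng.normal(size=N_NEURONS)            # initial activations

for t in range(TICKS):
    pre = w_mix @ x                        # pre-activations for this tick
    history = np.roll(history, -1, axis=1) # shift buffer left by one slot
    history[:, -1] = pre                   # append the newest pre-activation
    # Each neuron decides its activation from its OWN short history.
    x = np.tanh((w_hist * history).sum(axis=1))

print(x.shape)  # (8,)
```

The point of the loop is that a neuron's output at tick `t` depends on its last `HISTORY` pre-activations, so the model's state evolves over internal time even for a fixed input.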
8-minute read · From venturebeat.com
Table of contents

- How CTMs differ from Transformer-based LLMs
- Using variable, custom timelines to provide more intelligence
- Early results: how CTMs compare to Transformer models on key benchmarks and tasks
- What's needed before CTMs are ready for enterprise and commercial deployment?
- What enterprise AI leaders should know about CTMs
- Sakana's checkered AI research history
- Betting on evolutionary mechanisms