Llion Jones, a co-inventor of the ubiquitous Transformer architecture, contends that the artificial intelligence industry is ensnared in a "local minimum" of its own making. Arguing alongside Sakana AI research scientist Luke Darlow, he challenges the prevailing dogma that more scale and iterative tweaks to existing models will unlock genuine intelligence. Instead, Sakana AI proposes a biologically inspired paradigm shift: the Continuous Thought Machine (CTM).
Jones and Darlow spoke with the Machine Learning Street Talk podcast about the industry's "success capture," a phenomenon in which the remarkable efficacy of a dominant technology, like the Transformer, inadvertently stifles fundamental innovation. After years deeply entrenched in Transformer research, Jones made a deliberate decision to step away. "I'm going to drastically reduce the amounts of research that I'm doing specifically on the Transformer because of the feeling that I have that it's an oversaturated space," he stated, reflecting his conviction that the path to true intelligence lies elsewhere.
