Dr. Michael Timothy Bennett, a computer scientist known for his provocative paper "What the F*** is Artificial Intelligence," recently sat down with Enzo Blindow of Machine Learning Street Talk at the Diverse Intelligences Summer Institute. Their discussion delved deep into the fundamental nature of intelligence, challenging Silicon Valley’s prevailing "just scale it up" mentality and advocating for a biologically inspired approach to artificial intelligence. Bennett posits that true intelligence is not merely about accumulating parameters or data, but rather about "adaptation with limited resources," a concise definition he attributes to researcher Pei Wang.
Bennett argues that current AI systems, despite their impressive capabilities, fall well short of biological systems in efficiency. He highlights that "biological systems with a tiny fraction of the energy and learning data can do so much more" than their artificial counterparts. That stark contrast points to a fundamental flaw in the prevailing AI paradigm, which prioritizes sheer computational power and data volume over adaptive efficiency.
This leads to Bennett's "Law of the Stack": intelligence is maximized when control is delegated to lower levels of abstraction within a system. He provocatively asserts that "artificial intelligence is like a brittle bureaucracy that only adapts top down." Biological systems, by contrast, are decentralized and adaptable throughout, which allows robust and efficient responses to novel challenges.
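The contrast can be made concrete with a small toy model (a sketch for illustration only, not Bennett's formal account; the three-layer pipeline and the ReLU nonlinearity are assumptions made here). When the environment changes at the bottom of a processing stack, a system whose only adjustable knob sits at the top cannot fully recover, while a system that lets the affected layer adapt locally restores its behaviour outright:

```python
# Toy illustration of the "Law of the Stack" (not Bennett's formalism): a three-layer
# pipeline in which an environmental change hits the lowest layer. The delegated system
# lets that layer compensate locally; the top-down system may only adjust a bias at the top.

def relu(x):
    return max(x, 0.0)

def layer0(x, local_flip=False):
    # Lowest "sensor" layer; the only layer allowed to adapt in the delegated system.
    return -x if local_flip else x

def layer1(x):
    # Middle layer: a nonlinearity that discards information (everything below zero).
    return relu(x)

def layer2(x, bias=0.0):
    # Top "policy" layer; the only adjustable knob in the top-down system.
    return x + bias

def run(x, env_flip=False, local_flip=False, top_bias=0.0):
    # env_flip models a change in the environment at the bottom of the stack.
    signal = -x if env_flip else x
    return layer2(layer1(layer0(signal, local_flip)), bias=top_bias)

inputs = [-2.0, -1.0, 1.0, 2.0]
targets = [run(x) for x in inputs]                        # behaviour before the change

# Delegated adaptation: the lowest layer flips the signal back; behaviour is restored exactly.
delegated = [run(x, env_flip=True, local_flip=True) for x in inputs]

# Top-down adaptation: choose the single bias that best compensates on average.
errors = [t - run(x, env_flip=True) for x, t in zip(inputs, targets)]
best_bias = sum(errors) / len(errors)
top_down = [run(x, env_flip=True, top_bias=best_bias) for x in inputs]

print("targets:  ", targets)      # [0.0, 0.0, 1.0, 2.0]
print("delegated:", delegated)    # matches the targets
print("top-down: ", top_down)     # cannot match them all: information was lost below
```

The middle layer stands in for any processing step that discards information. Once information is lost on the way up, no amount of top-level tuning can reconstruct it, which is precisely the brittleness Bennett attributes to systems that only adapt top down.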
Bennett extends this critique to computational dualism, the view that software is separate from its underlying hardware and environment. Modern AI, in his striking analogy, has "just replaced the pineal gland with a Turing machine," an allusion to Descartes, who located the interface between immaterial mind and physical body in the pineal gland. The charge is that the field still relies on a dualistic view that ignores the holistic, embodied nature of intelligence. True intelligence, Bennett contends, must learn causal relationships and operate within the constraints of a physical world, rather than existing purely as abstract software.
The pursuit of Artificial General Intelligence (AGI) is further complicated by the limitations of current benchmarks. While benchmarks offer a convenient way to measure progress, they can be easily "solved" by systems that lack genuine understanding or adaptive capacity. Bennett suggests that an over-reliance on such metrics can obscure the deeper challenges of creating truly intelligent, versatile systems.
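The failure mode is easy to see in miniature (again a sketch for illustration, with invented questions and a hypothetical MemorisingModel class, not an example from the episode): a system that simply memorises a benchmark's answer key posts a perfect score while possessing no capability at all.

```python
# Toy illustration (invented for this article, not from the episode): a "model" that
# memorises a benchmark's answer key scores perfectly on the benchmark itself, yet
# fails the moment the items are varied even slightly.

benchmark = {
    "2 + 2": "4",
    "capital of France": "Paris",
    "plural of 'mouse'": "mice",
}

class MemorisingModel:
    def __init__(self, answer_key):
        self.answer_key = dict(answer_key)   # the "training" is just storing the test set

    def answer(self, question):
        return self.answer_key.get(question, "unknown")

model = MemorisingModel(benchmark)

# Headline result: a perfect benchmark score.
score = sum(model.answer(q) == a for q, a in benchmark.items()) / len(benchmark)
print(f"benchmark accuracy: {score:.0%}")                 # 100%

# Trivially varied items expose the absence of any underlying capability.
novel_items = {"2 + 3": "5", "capital of Italy": "Rome", "plural of 'goose'": "geese"}
novel_score = sum(model.answer(q) == a for q, a in novel_items.items()) / len(novel_items)
print(f"accuracy on novel variants: {novel_score:.0%}")   # 0%
```

Real systems game benchmarks in subtler ways, but the principle is the same: a score on a fixed set of items measures performance on those items, not the adaptive capacity Bennett argues intelligence actually consists of.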
Ultimately, Bennett's commentary serves as a crucial reminder for founders, VCs, and AI professionals: the path to advanced intelligence may not lie in simply amplifying existing methods. Instead, it demands a fundamental re-evaluation of what intelligence truly is, drawing inspiration from the elegant, efficient, and adaptable architectures found in nature.

