The emergence of the 100-page prompt, discussed alongside Nano Banana in a recent "Mixture of Experts" podcast featuring Tim Hwang, Aaron Baughman, Chris Hay, and Lauren McHugh Olende, underscores a critical inflection point in AI development. Prompts are often dismissed as trivial inputs, yet KPMG's use of such an extensive prompt for its agentic TaxBot, designed to generate 25-page advisory opinions, highlights the escalating complexity and specificity required to align large language models with intricate business processes. This meticulous crafting of instructions reveals that prompt engineering is far from a dying art; it is evolving into a sophisticated discipline.
Aaron Baughman, an IBM Fellow, shared his own experience with a 40-page prompt, emphasizing its effectiveness in summarizing complex manuals. Chris Hay, a Distinguished Engineer, concurred, stating that he is "not surprised" by such lengthy prompts. He argues that if a model lacks domain knowledge, embedding that information directly into the context window is a pragmatic solution, preferring it to "roll[ing] the dice at RAG." Lauren McHugh Olende, however, offered a sharper critique: "If a product requires a 100-page user manual to work, at best it's poorly designed, at worst it's broken." This succinctly captures the tension between brute-force contextualization and the pursuit of more elegant, perhaps fine-tuned, model architectures.
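The trade-off Hay and McHugh Olende are debating can be sketched in a few lines. The following is a minimal, hypothetical illustration, not anyone's production system: `stuffed_prompt` embeds an entire manual in the context window (the "100-page prompt" approach), while `rag_prompt` stands in for retrieval, using naive word overlap in place of a real embedding-based retriever. All function names, chunk contents, and the question are invented for the example.

```python
def stuffed_prompt(manual: str, question: str) -> str:
    """Embed the full manual in the prompt: simple, but consumes context."""
    return f"Reference material:\n{manual}\n\nQuestion: {question}"

def rag_prompt(chunks: list[str], question: str, k: int = 2) -> str:
    """Retrieve only the k chunks most relevant to the question.
    Word overlap is a toy stand-in for real semantic retrieval."""
    q_words = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    context = "\n".join(scored[:k])
    return f"Reference material:\n{context}\n\nQuestion: {question}"

# Invented toy data for illustration.
manual_chunks = [
    "Section 1: Depreciation schedules for capital equipment.",
    "Section 2: Filing deadlines for quarterly estimated tax.",
    "Section 3: Deduction rules for home office expenses.",
]
question = "What are the deduction rules for a home office?"

full = stuffed_prompt("\n".join(manual_chunks), question)
targeted = rag_prompt(manual_chunks, question, k=1)
```

Hay's point is that `stuffed_prompt` never misses: everything the model might need is guaranteed to be present, whereas `rag_prompt` "rolls the dice" that the retriever picks the right chunks. McHugh Olende's counterpoint is that needing to ship the whole manual on every call is a symptom, not a solution.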