The era of broad, undifferentiated enthusiasm for artificial intelligence, particularly Large Language Models (LLMs), is poised to give way to a more selective and demanding landscape, according to Rick Sherlund, Senior Advisor at Wedbush Investment Banking. Speaking on CNBC's "Fast Money," Sherlund underscored a pivotal shift from consumer-facing AI novelties to deep enterprise integration, a transition he predicts will place unprecedented strain on existing systems and reshape the tech trade significantly. This evolution, he argues, will necessitate strategic M&A, spur a robust IPO market for scaled private AI firms, and ignite an insatiable demand for data center infrastructure.
Sherlund, a Wall Street veteran, joined Melissa Lee and Scott Wapner to dissect the future trajectory of AI and its profound implications for the tech sector. His commentary painted a picture of an industry moving beyond the initial hype toward a phase of practical, business-critical application, where the rubber meets the road for LLMs.
This year, Sherlund observed, has been characterized by a "monolithic trade," largely driven by the "Magnificent 7" tech giants, which now constitute a substantial portion of the S&P 500. However, he anticipates a significant broadening of this trade in 2026. This expansion will encompass a wider universe of Software-as-a-Service (SaaS) companies, many of which have been "out of favor" due to market concerns regarding the architectural flexibility required to accommodate AI and potential risks to their existing business models. As these companies demonstrate progress in integrating AI and adapting their models, they are expected to regain investor interest and drive a more diversified market rally.
The core of Sherlund's analysis centers on the deepening integration of AI into enterprise workflows. While AI has previously served as a "novelty for consumers" and a productivity booster for knowledge workers through chats and summarization, it is now "finally becoming part of business process and workflows." This marks a critical inflection point, moving AI from an auxiliary tool to an embedded component of how companies conduct their core operations.
This shift involves the sophisticated use of "agents or agentics" collaborating to execute complex workflows. Such processes are far more demanding than simple chatbot queries: a single call to an LLM for a direct answer is one instance of inference, but a complex workflow could entail "10 or 50 trips back to the LLM" as agents interact, reason, and refine tasks. This multiplication of inference calls will profoundly "stress" enterprise systems.
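The arithmetic behind that claim can be made concrete. Below is a minimal Python sketch, not any real agent framework, in which the LLM is mocked and all names are hypothetical; it simply counts inference calls to contrast a one-shot chat query with an agentic workflow that plans, iterates, and summarizes:

```python
# Hypothetical illustration: count inference calls for a one-shot chat
# versus a multi-step agentic workflow. The "LLM" here is a mock.

inference_calls = 0

def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call; we only count invocations."""
    global inference_calls
    inference_calls += 1
    return f"response to: {prompt}"

def simple_chat(question: str) -> str:
    # One question, one trip to the LLM.
    return call_llm(question)

def agentic_workflow(task: str, steps: int = 10) -> str:
    # An agent plans, works through sub-tasks, then summarizes;
    # every round is another trip back to the LLM.
    result = call_llm(f"plan: {task}")
    for i in range(steps):
        result = call_llm(f"step {i}: refine {result[:40]}")
    return call_llm(f"summarize: {result[:40]}")

simple_chat("What is our Q3 revenue?")
single = inference_calls  # 1 call

inference_calls = 0
agentic_workflow("reconcile supplier invoices", steps=10)
agentic = inference_calls  # 12 calls: 1 plan + 10 steps + 1 summary

print(f"chat: {single} call(s); agentic workflow: {agentic} calls")
```

Even this toy loop shows a roughly tenfold jump in inference volume per task, which is the mechanism behind the infrastructure strain Sherlund describes.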
The implications for infrastructure are staggering. The sheer volume of inference required for these integrated business processes will create "enormous demand for data centers." Sherlund expressed no concern about the data center trade, confidently stating, "we're going to have so much more demand than we have supply of data centers." Meeting this surge, however, will be constrained by capital, GPU availability, and memory, pointing to a bottleneck that will drive significant investment and innovation in data center technologies.
Furthermore, Sherlund delved into the competitive dynamics of the LLM market itself. He characterizes it as a "very competitive business," with new entrants and open-source models (from Chinese labs as well as Meta and Nvidia) constantly shifting the landscape. Despite this fierce competition at the foundational model layer, Sherlund does not believe the leaders in the LLM market will derive their primary revenue from the models themselves. Instead, he posits they will "go up the stack," emulating historical strategies such as Microsoft leveraging Windows or Oracle its database, by embedding LLM capabilities into higher-value applications and services. This vertical integration will be key to long-term profitability and market dominance.
Looking ahead, Sherlund foresees a bustling M&A environment and a "very robust IPO market" for private AI companies that have achieved meaningful scale and traction. Enterprise companies, needing to rapidly infuse AI into their architectures, will increasingly turn to acquisitions as an efficient pathway to integrate advanced AI capabilities. This strategic imperative will drive significant deal activity, fostering a vibrant ecosystem of innovation and consolidation. The convergence of these trends—broadening AI adoption, exploding inference demand, data center expansion, and strategic market plays—underscores a transformative period for the global tech economy, moving AI from an exciting promise to an indispensable operational reality.