Enterprise AI Shifts from Hype to Agents, Databricks CEO Ghodsi Reveals

The burgeoning excitement around AI agents in the enterprise, promising significant gains in productivity and revenue, was a central theme in Databricks CEO Ali Ghodsi's recent discussion on CNBC's 'The Exchange.' Ghodsi spoke with MacKenzie Sigalos about Databricks' multi-year, $100 million partnership with OpenAI, which focuses on integrating OpenAI models into Databricks' platform so customers can build AI applications and agents that leverage their corporate data. The conversation extended to the current state of enterprise AI adoption, how to measure return on investment, and the broader implications of AI spending in a rapidly evolving technology landscape.

Ghodsi emphasized that enterprises are "really, really excited to get agents actually working inside the enterprise." He clarified that while the buzz around AI has been significant, widespread, scalable success with AI agents is still in its nascent stages. The initial phase of AI adoption saw considerable investment in foundational infrastructure and exploration, sometimes outpacing immediate, measurable returns. However, tangible use cases are beginning to emerge, demonstrating clear value propositions that extend beyond mere experimentation. These agents are designed to automate complex tasks, enhance decision-making, and unlock new operational efficiencies by interacting directly with a company's unique and proprietary data.

The transition from conceptual potential to demonstrable business impact is a critical inflection point for enterprise AI, and Ghodsi cited several examples of this shift. Mastercard has been using Databricks and OpenAI for all of its product onboarding processes, streamlining a historically complex and labor-intensive operation. Adidas is deploying AI agents to analyze global sentiment around its shoe designs, enabling the company to "improve those shoes and the designs in a way they couldn't do before with humans." That capability allows real-time, data-driven product iteration based on vast amounts of unstructured feedback from across the web. Block (formerly Square), the payments company, can now fully automate how stores and restaurants install, interact with, and configure their payment systems simply "by just speaking English to it," rather than working through intricate manual setups. This natural-language interface represents a significant leap in user experience and operational simplicity.

Despite these promising early successes, Ghodsi cautioned against over-exuberance, acknowledging that "people got too excited maybe in the last two years," leading to investments that were "a little bit ahead of themselves." He anticipates that a complete, transformative shift across every company on the planet, fundamentally altering business models and operations, will take "a few years" to fully materialize. This measured optimism suggests a recognition that while the potential is immense, the practical implementation and integration of AI into complex enterprise workflows demand time, sustained effort, and a realistic understanding of deployment challenges. The current surge in AI investing, particularly in foundational infrastructure, is a necessary precursor to this broader adoption, laying the groundwork for future scalability and impact.

Still, the impact is real: these projects are landing in production, not sitting in pilot programs.

MacKenzie Sigalos raised a pertinent concern about a potential "circular spending problem" in the AI ecosystem, in which massive investments flow from chip manufacturers like Nvidia to AI developers like OpenAI, only for those funds to cycle back into hardware purchases, potentially centralizing power and creating a single point of failure. Ghodsi addressed this by pointing to Databricks' platform approach, which aims to give enterprises flexibility and mitigate such risks. He said Databricks is unique in natively hosting "all three major models": OpenAI, Anthropic, and Google's Gemini. This multi-model strategy is not merely about offering choice; it builds in redundancy and resilience for customers. By letting enterprises switch between or combine leading AI models, Databricks ensures they are not locked into a single vendor or technology stack. That diversification fosters a more competitive and robust AI landscape, protecting enterprises from disruptions or escalating costs tied to any one provider.

Ghodsi underscored that only a "small portion of the people on the planet are using LLMs right now," indicating vast untapped potential for growth. He firmly believes that "there's going to be much more AI usage in the future than we have today," necessitating further infrastructure build-outs globally. Geographically distributed deployments will be crucial for reducing latency and bringing AI capabilities closer to end users and data sources. Databricks' current focus, he concluded, remains squarely on ensuring enterprise customers succeed with their AI use cases, moving beyond theoretical potential to demonstrable business impact and measurable return on investment. The journey is still in its early chapters, but the path toward widespread, impactful AI integration is becoming clearer.