Automotive service centers face mounting pressure. Rising call volumes, the complexity of electric vehicles, and the sheer data deluge from connected cars overwhelm human agents. Mazda, grappling with hundreds of new service information documents annually, needed a faster way to equip its technical service teams.
The automaker turned to the Databricks Lakehouse, transforming its legacy systems into a unified platform for data, analytics, and AI. This move enabled a swift development cycle for GenAI applications in technical service operations.
From Ad-Hoc Testing to Governed AI
A two-person Mazda data science team delivered a functional GenAI pilot in roughly eight weeks. The project prioritized nimbleness, minimizing infrastructure overhead by using Databricks' integrated capabilities for data pipelines, vector search indexing, and model deployment.
Early development focused on a Retrieval Augmented Generation (RAG) design, connecting a large language model (LLM) with Mazda's proprietary technical documents. Grounding responses in retrieved, up-to-date service documentation keeps the AI's answers factual, which is essential in technical service work.
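The retrieve-then-generate pattern can be sketched in a few lines. This is an illustrative toy, not Mazda's implementation: a lexical scorer stands in for the vector search index, and `build_prompt` shows how retrieved passages would be placed ahead of the user's question.

```python
# Toy RAG sketch: a keyword scorer stands in for a real vector search index.
SAMPLE_DOCS = [
    {"id": "TSB-101", "text": "Battery coolant pump replacement procedure"},
    {"id": "TSB-202", "text": "Infotainment software update steps"},
]

def retrieve(query: str, documents: list, top_k: int = 2) -> list:
    """Rank documents by crude keyword overlap with the query (vector-search stand-in)."""
    words = query.lower().split()
    scored = [(sum(w in doc["text"].lower() for w in words), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def build_prompt(query: str, context_docs: list) -> str:
    """Ground the LLM by prepending retrieved passages to the question."""
    context = "\n".join(f"[{d['id']}] {d['text']}" for d in context_docs)
    return f"Answer using only the sources below.\n{context}\n\nQuestion: {query}"
```

In a production system the retriever would query a managed vector index over embedded document chunks, but the shape of the flow is the same: retrieve, assemble context, generate.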
A key architectural decision was allowing the frontend interface and the AI agent to share code and tooling. This ensures vehicle context—including service history, open recalls, and warranty status—is pre-loaded and injected into the system prompt, providing immediate interactivity and eliminating redundant tool calls.
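Pre-loading vehicle context might look like the following sketch. The lookup function, field names, and recall identifier are all hypothetical stand-ins for governed queries against service-history tables.

```python
# Hypothetical sketch: fetch vehicle context once and inject it into the system
# prompt, so the agent does not need a tool call to rediscover it each turn.

def build_system_prompt(vin: str, lookup) -> str:
    ctx = lookup(vin)  # service history, open recalls, warranty status
    recalls = ", ".join(ctx["open_recalls"]) or "none"
    return (
        "You are a technical service assistant.\n"
        f"Vehicle {vin}: model={ctx['model']}, "
        f"open_recalls={recalls}, warranty={ctx['warranty']}."
    )

def demo_lookup(vin: str) -> dict:
    # Stand-in for a governed query; values are made up for the example.
    return {"model": "CX-5", "open_recalls": ["R2024-07"], "warranty": "active"}
```

Because the frontend builds the same context payload, it can render recall and warranty details immediately while the agent reasons over an identical view of the vehicle.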
Unity Catalog for Unified Governance
The entire solution is built on the Databricks Lakehouse and its GenAI capabilities. Unity Catalog provides governed access to critical data while also hosting embeddings, vector search indexes, and model serving within a single environment.
Precise corpus filtering, a critical step for accurate retrieval, is managed via Unity Catalog user-defined functions. These functions map vehicle identification numbers (VINs) or diagnostic trouble codes to specific document subsets, drastically improving relevance.
This constrains semantic search to only those documents applicable to the vehicle in question, a significant improvement over naive full-corpus searches. Hosting the logic in Unity Catalog keeps it alongside the data it governs and lets it be updated independently of agent deployments.
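The filtering step reduces to a mapping from vehicle identifiers to a document subset. The sketch below shows the idea in plain Python; the VIN-prefix table, trouble codes, and documents are invented for illustration, and in practice this logic would live in Unity Catalog functions querying governed tables.

```python
# Illustrative corpus filter: resolve a VIN prefix to a model, then keep only
# documents matching that model and the reported diagnostic trouble code (DTC).

MODEL_BY_VIN_PREFIX = {"JM3": "CX-5", "JM1": "Mazda3"}  # made-up mapping

DOCS = [
    {"id": "D1", "model": "CX-5", "dtc": "P0301", "title": "Cylinder 1 misfire diagnosis"},
    {"id": "D2", "model": "CX-5", "dtc": "P0420", "title": "Catalyst efficiency check"},
    {"id": "D3", "model": "Mazda3", "dtc": "P0301", "title": "Cylinder 1 misfire diagnosis"},
]

def corpus_filter(vin: str, dtc: str, docs=DOCS) -> list:
    """Return only the documents applicable to this vehicle and trouble code."""
    model = MODEL_BY_VIN_PREFIX.get(vin[:3].upper())
    return [d for d in docs if d["model"] == model and d["dtc"] == dtc]
```

Semantic search then runs only over the surviving subset, so a misfire query on a CX-5 can never surface a Mazda3 procedure.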
Accelerating Development with MLflow
Mazda transitioned from informal, feedback-driven iteration to a more rigorous test-driven development approach using MLflow 3's native GenAI evaluation framework, which supports comprehensive evaluation datasets scored by both LLM judges and deterministic checks.
Gaps identified by testers were added to evaluation datasets, becoming acceptance criteria for fixes. This shift from subjective assessments like "it seems better" to objective validation, supported by experiment traces capturing prompts, retrieval strategies, and response metrics, significantly improved the system's reliability.
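The loop can be sketched in plain Python (this is not MLflow's actual API): each tester-reported gap becomes an evaluation case, a deterministic scorer replaces "it seems better" with a number, and a fix is accepted only if the aggregate score does not regress.

```python
# Test-driven evaluation sketch: cases encode acceptance criteria from tester
# feedback; the scorer is deterministic so results are reproducible.

EVAL_CASES = [
    {"question": "How do I clear a P0301 misfire?", "must_mention": ["spark plug", "coil"]},
    {"question": "What is the lug nut torque spec?", "must_mention": ["torque"]},
]

def keyword_scorer(response: str, must_mention: list) -> float:
    """Fraction of required terms present in the response (a deterministic scorer)."""
    text = response.lower()
    return sum(term in text for term in must_mention) / len(must_mention)

def run_eval(answer_fn, cases=EVAL_CASES) -> float:
    """Average score across the dataset; the regression gate for any change."""
    scores = [keyword_scorer(answer_fn(c["question"]), c["must_mention"]) for c in cases]
    return sum(scores) / len(scores)
```

In the real framework, LLM-judge scorers would sit alongside deterministic ones like this, and experiment traces would record the prompts and retrieval strategies behind each score.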
The architecture proved adaptable: multilingual embedding models support retrieval across languages, with the LLM translating user prompts and responses. This suggests the core system can be expanded to other global markets with minimal modification.
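One way to picture the multilingual path is translation at the edges: translate the query into the corpus language before retrieval, then translate the answer back. The sketch below is hypothetical; `translate` is a stub lookup table standing in for an LLM translation call.

```python
# Translate-at-the-edges sketch. The stub table stands in for LLM translation.
TRANSLATIONS = {
    ("ja", "en", "バッテリー交換"): "battery replacement",
    ("en", "ja", "See TSB-101."): "TSB-101を参照してください。",
}

def translate(text: str, source: str, target: str) -> str:
    """Stub translator; a real system would call an LLM here."""
    return TRANSLATIONS.get((source, target, text), text)

def answer(query: str, lang: str, answer_en) -> str:
    """Translate the query to English, answer, then translate back to the user's language."""
    query_en = query if lang == "en" else translate(query, lang, "en")
    response_en = answer_en(query_en)
    return response_en if lang == "en" else translate(response_en, "en", lang)
```

With multilingual embeddings, the retrieval step itself can also operate across languages, shrinking the role translation has to play.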
A Foundation for Future Innovation
Mazda has established a repeatable foundation for GenAI applications, integrating structured and unstructured data within the lakehouse. Capabilities like vector search, model serving, evaluations, and catalog-level governance are now consolidated on a single platform.
This unified approach dramatically accelerated development, allowing two data scientists to manage complex AI applications with full traceability. The team is now expanding the framework to other diagnostic workflows and exploring generative agents to assist technicians and field engineers, moving beyond descriptive reporting to intelligent applications built on governed enterprise data.
The project embodies Mazda's guiding principle of omotenashi—wholehearted hospitality—by empowering service agents with better tools to care for customers.