Support investigations are fundamentally research problems, and gathering context has historically been the slowest phase. Cursor has reshaped this by consolidating code, logs, team knowledge, and past conversations into a single AI session, removing that bottleneck for over 75% of its support interactions and increasing engineer throughput by 5-10x.
Starting from the Codebase
Investigations begin in Ask Mode, tracing symptoms backward through product behavior. With the full codebase indexed locally, Cursor performs semantic searches across code, docs, and internal tools simultaneously. Multi-root workspaces are crucial here, as product context often spans multiple repositories. A single thread can address complex questions involving frontend logic, backend checks, and documentation.
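The idea of one search spanning several repository roots can be sketched with a toy keyword-overlap ranker. This is a minimal illustration only: Cursor's real index uses semantic embeddings, and the repo names and files below are hypothetical.

```python
# Toy sketch: rank files across multiple workspace roots by keyword
# overlap with a query. Stands in for one semantic search spanning
# frontend, backend, and docs repos in a multi-root workspace.

def score(text: str, query_terms: set[str]) -> int:
    """Count how many query terms appear as whole words in the text."""
    return len(set(text.lower().split()) & query_terms)

def search_workspace(roots: dict[str, dict[str, str]], query: str, top_k: int = 3):
    """roots maps repo name -> {relative path -> file contents}."""
    terms = set(query.lower().split())
    hits = []
    for repo, files in roots.items():
        for path, text in files.items():
            s = score(text, terms)
            if s:
                hits.append((s, f"{repo}/{path}"))
    hits.sort(reverse=True)
    return [path for _, path in hits[:top_k]]

# Hypothetical multi-root workspace contents.
workspace = {
    "frontend": {"checkout.ts": "render checkout form and validate coupon code"},
    "backend": {"billing.py": "apply coupon code and charge the card"},
    "docs": {"billing.md": "how coupon codes interact with refunds"},
}
print(search_workspace(workspace, "coupon code"))
```

A single question about coupon behavior surfaces matches from all three roots at once, which is the property multi-root workspaces provide.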
Integrating Support Sources with MCP
Cursor uses MCP servers to pull essential context directly into investigations instead of searching disparate tools. These integrations bring in customer databases, streamed event logs, communication platforms like Slack, engineering ticket systems, internal documentation, and account management services.
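The pattern behind these integrations can be illustrated schematically: a server exposes named tools, and the agent invokes them with JSON-RPC-style requests rather than the engineer searching each tool by hand. This is not the real MCP SDK; the tool names and payloads are invented for illustration.

```python
# Schematic illustration of the MCP pattern (not the actual SDK):
# named tools dispatched via JSON-RPC-style requests.
import json

TOOLS = {
    # Hypothetical tools standing in for real MCP servers.
    "customer_db.lookup": lambda args: {"plan": "team", "seats": 42},
    "slack.search": lambda args: {"threads": ["#support: checkout 500s"]},
}

def handle_request(raw: str) -> str:
    """Parse a request, dispatch to the named tool, return the result."""
    req = json.loads(raw)
    result = TOOLS[req["method"]](req.get("params", {}))
    return json.dumps({"id": req["id"], "result": result})

response = handle_request(json.dumps(
    {"id": 1, "method": "customer_db.lookup", "params": {"email": "a@b.c"}}
))
print(response)
```

From the agent's perspective, a customer lookup, a log query, and a Slack search all share one calling convention, which is what lets a single thread draw on every source.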
Identifying Failures and Similar Cases
When a customer reports an error, engineers must pinpoint the failure location (client-side, API, dependency, etc.). Datadog MCP integration allows relevant logs and traces to be pulled directly into the investigation thread. Similarly, MCPs connected to support platforms and Slack enable searching for similar past issues. The goal is to find the most relevant threads, including current status, workarounds, and assigned owners.
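The first triage step, guessing which layer an error comes from before pulling full traces, can be sketched as simple pattern matching over a log line. The patterns and layer names here are illustrative assumptions, not Cursor's actual triage rules.

```python
# Hypothetical triage sketch: classify where a reported error likely
# originates (client-side, API, or dependency) from a single log line.
import re

LAYER_PATTERNS = [
    (r"TypeError|undefined is not", "client-side"),
    (r"HTTP 5\d\d|timeout contacting api", "API"),
    (r"upstream|provider|rate.?limit", "dependency"),
]

def triage(log_line: str) -> str:
    """Return the first matching layer, or 'unknown'."""
    for pattern, layer in LAYER_PATTERNS:
        if re.search(pattern, log_line, re.IGNORECASE):
            return layer
    return "unknown"

print(triage("HTTP 502 from gateway"))
```

A real investigation would confirm the guess against the Datadog traces pulled into the thread; the point is that the classification narrows where to look first.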
Bug Determination and Reporting
Distinguishing bugs from expected behavior is a common challenge. Notion MCP can pull relevant runbooks into the investigation thread for cross-referencing. Once an issue is confirmed as a bug, the Linear MCP turns the gathered context into a formatted escalation ticket directly from the same thread.
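The escalation step amounts to serializing the context already gathered in the thread into a structured ticket body. A minimal sketch, with field names that are illustrative rather than the Linear MCP's actual schema:

```python
# Hedged sketch: turn investigation context into a formatted ticket
# body. Field names ("impact", "repro", "evidence") are assumptions.
def format_escalation(context: dict) -> str:
    lines = [
        f"## {context['title']}",
        "",
        f"**Impact:** {context['impact']}",
        f"**Repro:** {context['repro']}",
        "",
        "### Evidence",
    ]
    lines += [f"- {item}" for item in context["evidence"]]
    return "\n".join(lines)

ticket = format_escalation({
    "title": "Checkout 500s on coupon redemption",
    "impact": "3 customers, started 2h ago",
    "repro": "apply any expired coupon at checkout",
    "evidence": ["Datadog trace abc123", "Slack thread in #support"],
})
print(ticket)
```

Because the logs, runbook excerpts, and customer details are all in one thread, the ticket carries its evidence with it instead of linking out to scattered tools.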
Documentation and Automation
Recurring customer questions highlight documentation gaps. Support engineers can directly initiate updates by mentioning @Cursor in Slack, triggering a cloud agent to open a PR against the docs repo. Common steps like creating tickets, drafting replies, and searching for known issues are automated via slash commands. Rules and Skills further streamline repeated processes, while subagents can run parallel tasks like log investigation, known-issue mining, ticket writing, and customer reply drafting, merging results for review.
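The fan-out/merge pattern described above can be sketched with threads standing in for subagents. The three task functions are hypothetical stand-ins for log investigation, known-issue mining, and reply drafting; real subagents would each run a full agent loop.

```python
# Sketch of the fan-out/merge subagent pattern: run independent tasks
# in parallel, then merge their results into one payload for review.
from concurrent.futures import ThreadPoolExecutor

def investigate_logs():
    return {"logs": "502s correlate with coupon service deploy"}

def mine_known_issues():
    return {"known_issues": ["ENG-1234: coupon validation regression"]}

def draft_reply():
    return {"reply": "We've identified the issue and are rolling back."}

def run_subagents(tasks):
    """Run each task concurrently and merge the dict results."""
    merged = {}
    with ThreadPoolExecutor() as pool:
        for result in pool.map(lambda task: task(), tasks):
            merged.update(result)
    return merged

report = run_subagents([investigate_logs, mine_known_issues, draft_reply])
print(sorted(report))
```

Merging at the end keeps a human review step in the loop: the engineer sees one consolidated report rather than three separate streams.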
AI-Native Technical Support
This AI-native approach integrates code research directly into technical support, yielding productivity gains estimated to be an order of magnitude greater than traditional methods. This efficiency allows Cursor's support team to effectively manage a rapidly scaling user base.
