Model Context Protocol (MCP)
Triage supports the Model Context Protocol (MCP), so you can expose your Knowledge Bases as MCP servers. This gives your AI tools a standard, reliable way to tap into your company’s business context without building your own ingestion pipelines or RAG stack.
What is MCP?
The Model Context Protocol (MCP) is an open standard that lets AI models safely access external data and tools. When you expose a Triage Knowledge Base as an MCP server, you create a bridge between your curated data (Jira, Slack, Confluence, etc.) and the AI agents you use every day.
Use cases
- BYO custom AI workflows
Go beyond the workflows Triage provides out of the box. Use agent builders like OpenAI’s AgentKit, n8n, or Langflow to wire Triage MCP servers into your own workflows, such as automated ticket classification, incident resolution, KCS article generation, and other support or operations automations (see the sketch after this list).
- Enrich off-the-shelf AI applications
Bring your proprietary business context into the AI tools you already rely on by plugging Triage Knowledge Bases in as MCP servers:
* IDEs: Cursor, Windsurf, Warp
* Chat Apps: Claude, ChatGPT
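For example, a custom workflow step might pull Knowledge Base context before classifying a ticket. The sketch below uses the MCP Python SDK (the `mcp` package); the server URL and the `search_knowledge_base` tool name are placeholders, so substitute the URL from your Knowledge Base’s MCP configuration and whatever tools it actually exposes.

```python
# Minimal sketch: fetch Knowledge Base context for a ticket as one step
# of a custom workflow (e.g., automated ticket classification).
# Assumptions: the MCP Python SDK ("pip install mcp"), a placeholder
# server URL, and a hypothetical "search_knowledge_base" tool name.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

TRIAGE_MCP_URL = "https://app.triage.example/mcp/your-knowledge-base"  # placeholder


async def fetch_kb_context(ticket_text: str) -> str:
    async with streamablehttp_client(TRIAGE_MCP_URL) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call a search-style tool exposed by the Knowledge Base
            # (tool names vary; inspect them with session.list_tools()).
            result = await session.call_tool(
                "search_knowledge_base", {"query": ticket_text}
            )
            # Concatenate any text content blocks returned by the tool.
            return "\n".join(
                block.text for block in result.content if hasattr(block, "text")
            )


async def classify_ticket(ticket_text: str) -> None:
    context = await fetch_kb_context(ticket_text)
    # Hand the ticket plus retrieved context to your agent or LLM of choice
    # (AgentKit, n8n, Langflow, etc.) for the actual classification step.
    print(context[:500])


if __name__ == "__main__":
    asyncio.run(classify_ticket("Customer reports SSO login loops after the 2.4 upgrade"))
```

Agent builders such as AgentKit, n8n, or Langflow typically only need the MCP server URL registered as a tool source; a client call like the one above is roughly what those integrations perform for you behind the scenes.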
How to connect
Each Knowledge Base in Triage exposes its own MCP server configuration:
- Navigate to your Knowledge Base in Triage.
- Click on the MCP icon.
- Choose your authentication mechanism:
Option A: OAuth
- Copy the MCP server URL and paste it in your AI tool's MCP configuration.
Option B: API Key-based authentication
- Create an API key: from the left sidebar, click Settings > API Keys > Create a new key.
- Copy the configuration JSON provided.
- Paste it into your AI tool's configuration file (for example, claude_desktop_config.json or Cursor's MCP settings).
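Once the key is created, you can sanity-check the connection outside your AI tool. Below is a minimal sketch using the MCP Python SDK, assuming the API key is sent as a bearer token in an `Authorization` header; use the exact URL and header shown in the configuration JSON Triage provides.

```python
# Minimal connectivity check: list the tools a Knowledge Base MCP server
# exposes, authenticating with an API key. The URL and the Authorization
# header format are assumptions; copy the real values from the
# configuration JSON that Triage provides for your Knowledge Base.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

TRIAGE_MCP_URL = "https://app.triage.example/mcp/your-knowledge-base"  # placeholder
API_KEY = "your-api-key"  # placeholder


async def list_kb_tools() -> None:
    headers = {"Authorization": f"Bearer {API_KEY}"}
    async with streamablehttp_client(TRIAGE_MCP_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description or "")


if __name__ == "__main__":
    asyncio.run(list_kb_tools())
```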
