[FEAT]: Model Context Protocol (MCP) Integration #2883
Comments
I was tempted to have Claude write a heartfelt support missive to put here, but nah. MCP would be an amazing add to AnythingLLM. MCP is basically the AnythingLLM of the function calling space... pretty please? :) Peace.
I recommend starting by connecting AnythingLLM to MCP via a custom agent skill plugin at `agent-skills/mcp-connector/`. This isn't working quite right for me, but maybe you can get it to work.

**plugin.json**

```json
{
  "active": true,
  "hubId": "mcp-connector",
  "name": "MCP Protocol Connector",
  "schema": "skill-1.0.0",
  "version": "1.0.0",
  "description": "Connects to MCP-compatible tools and APIs",
  "author": "If he makes it work, @GitDakky",
  "license": "MIT",
  "setup_args": {
    "MCP_HOST": {
      "type": "string",
      "required": true,
      "input": {
        "type": "text",
        "default": "localhost:3000",
        "placeholder": "localhost:3000",
        "hint": "Address of your MCP server"
      }
    }
  },
  "entrypoint": {
    "file": "handler.js",
    "params": {
      "tool_name": {
        "description": "Name of the MCP tool to call",
        "type": "string"
      },
      "args": {
        "description": "Arguments for the tool",
        "type": "object"
      }
    }
  },
  "examples": [
    {
      "prompt": "Calculate 5 plus 3",
      "call": "{\"tool_name\": \"calculate_sum\", \"args\": {\"a\": 5, \"b\": 3}}"
    },
    {
      "prompt": "Get slideshow data",
      "call": "{\"tool_name\": \"httpbin_json\", \"args\": {}}"
    }
  ]
}
```

**handler.js**
```js
module.exports.runtime = {
  handler: async function ({ tool_name, args }) {
    try {
      // Log the attempt
      this.logger(`Attempting to call MCP tool: ${tool_name}`);
      this.introspect(`Connecting to MCP server at ${this.runtimeArgs.MCP_HOST}`);

      // Make the MCP tool call
      const response = await fetch(`http://${this.runtimeArgs.MCP_HOST}/tools/${tool_name}`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(args)
      });

      if (!response.ok) {
        throw new Error(`MCP tool call failed: ${response.statusText}`);
      }

      const result = await response.json();
      return JSON.stringify(result);
    } catch (error) {
      this.logger(`Error in MCP connector: ${error.message}`);
      return `Failed to execute MCP tool: ${error.message}`;
    }
  }
};
```
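If you want to poke at handler.js outside of AnythingLLM, calling the exported handler directly with a stubbed context works for quick debugging. This harness is purely hypothetical: the `logger`/`introspect`/`runtimeArgs` stubs just mirror what the handler reads, it is not an official AnythingLLM test API, and it needs Node 18+ for the global `fetch` the handler uses.

```js
// test-handler.js -- hypothetical local harness, not part of the agent-skill API.
// Requires Node 18+ (handler.js relies on global fetch).
const { runtime } = require("./handler.js");

// Stub only the pieces of the skill context that handler.js actually touches.
const ctx = {
  logger: (msg) => console.log("[log]", msg),
  introspect: (msg) => console.log("[introspect]", msg),
  runtimeArgs: { MCP_HOST: "localhost:3000" },
};

// Invoke the handler roughly the way the agent would, binding our stub context.
runtime.handler
  .call(ctx, { tool_name: "calculate_sum", args: { a: 5, b: 3 } })
  .then((out) => console.log("handler returned:", out));
```

Run it with `node test-handler.js` while whatever server `MCP_HOST` points at is up.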
**README.md**

```md
# MCP Protocol Connector
This custom agent skill allows AnythingLLM to interact with MCP-compatible tools and services.
## Setup
1. Install the skill in your AnythingLLM plugins directory
2. Configure the MCP_HOST in the agent skills settings
3. Ensure your MCP server is running and accessible
## Usage
The skill can call any tool registered with your MCP server:
Calculate sum example:

    {
      "tool_name": "calculate_sum",
      "args": {
        "a": 5,
        "b": 3
      }
    }

Get external data example:

    {
      "tool_name": "httpbin_json",
      "args": {}
    }
## Requirements
- AnythingLLM running in supported environment
- Access to an MCP-compatible server
## Error Handling
The skill provides detailed error messages and logging for:
- Connection failures
- Invalid tool names
- Invalid arguments
- Server errors
## Logging
All tool calls are logged with:
- Tool name
- Arguments
- Success/failure status
- Error messages (if any)
## Support
If you encounter any issues:
1. Check the AnythingLLM logs
2. Verify your MCP server is running
3. Ensure the MCP_HOST configuration is correct
```

Installation steps:

```sh
mkdir -p plugins/agent-skills/mcp-connector
# From the mcp-connector directory
touch plugin.json handler.js README.md
# Then paste the contents into each file
chmod 644 plugin.json README.md
chmod 755 handler.js
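# Optional sanity check (hypothetical; assumes the server at MCP_HOST exposes
# the POST /tools/<tool_name> contract that handler.js calls -- adjust host/port):
curl -s -X POST "http://localhost:3000/tools/calculate_sum" \
  -H 'Content-Type: application/json' \
  -d '{"a": 5, "b": 3}'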
```

The skill will then be available in your AnythingLLM instance and can be configured through the UI.
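For local testing, here is a minimal sketch of a mock server that satisfies the `POST /tools/<tool_name>` contract handler.js assumes. To be clear, this is not a real MCP server (MCP proper speaks JSON-RPC over stdio or SSE); it only emulates the HTTP shape this connector expects, and the two tool names are just the examples from plugin.json.

```js
// mock-tools-server.js -- emulates the POST /tools/<tool_name> contract assumed by handler.js.
// Not a real MCP server; it only exists so the connector's request/response path can be tested.
const http = require("http");

const tools = {
  // The two example tools referenced in plugin.json
  calculate_sum: ({ a, b }) => ({ result: a + b }),
  // Canned payload shaped roughly like httpbin.org/json
  httpbin_json: () => ({ slideshow: { title: "Sample Slide Show", slides: [] } }),
};

http
  .createServer(async (req, res) => {
    const match = req.url.match(/^\/tools\/([\w-]+)$/);
    if (req.method !== "POST" || !match || !tools[match[1]]) {
      res.writeHead(404, { "Content-Type": "application/json" });
      return res.end(JSON.stringify({ error: "unknown tool" }));
    }

    // Collect the JSON body the handler sends as the tool arguments.
    let body = "";
    for await (const chunk of req) body += chunk;

    try {
      const args = body ? JSON.parse(body) : {};
      const output = await tools[match[1]](args);
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify(output));
    } catch (err) {
      res.writeHead(500, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ error: err.message }));
    }
  })
  .listen(3000, () => console.log("Mock tool server listening on http://localhost:3000"));
```

Run `node mock-tools-server.js`, point `MCP_HOST` at `localhost:3000`, and the two example prompts in plugin.json should round-trip.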
What would you like to see?
Description
Request to integrate Model Context Protocol (MCP) support into AnythingLLM to enhance interoperability and standardization of context handling across different LLM implementations.
Motivation
MCP is becoming a standard protocol for AI applications, offering a "USB-C port for AI applications" that would benefit AnythingLLM by:
Proposed Implementation
The implementation would require (a rough client-side sketch follows this list):
1. Core MCP Features
2. Integration Points
3. Security Considerations
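To make the client side of a native integration a bit more concrete, here is a rough sketch of discovering and proxying an MCP server's tools, assuming the official `@modelcontextprotocol/sdk` JavaScript client. The import paths and the `listTools`/`callTool` calls reflect my understanding of that SDK rather than anything specified in this issue, and `my-mcp-server.js` is a placeholder.

```js
// Hedged sketch: bridge an MCP server's tools into an agent runtime.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function loadMcpTools(command, args = []) {
  // Spawn the MCP server as a child process and talk to it over stdio.
  const transport = new StdioClientTransport({ command, args });
  const client = new Client(
    { name: "anythingllm-mcp-bridge", version: "0.0.1" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover the server's tools so they can be registered as agent skills.
  const { tools } = await client.listTools();
  return tools.map((tool) => ({
    name: tool.name,
    description: tool.description,
    // The agent would call this when it decides to use the tool.
    invoke: (toolArgs) => client.callTool({ name: tool.name, arguments: toolArgs }),
  }));
}

// Example: spawn a local MCP server over stdio and list what it offers.
loadMcpTools("node", ["my-mcp-server.js"]).then((tools) =>
  console.log(tools.map((t) => t.name))
);
```

Each discovered tool could then be surfaced as an agent skill; argument validation and the security questions above (which servers to trust, which tools to expose) would live in this bridge layer.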
Benefits
Technical Requirements
Additional Context
MCP is being adopted by various applications including: