🌐 MCP-Use-TS is an open-source TypeScript library for connecting any LLM to any MCP server and building custom agents with tool access, without relying on closed-source application clients.

💡 It lets developers easily connect any LLM to tools like web browsing, file operations, and more, with full TypeScript support.
| Feature | Description |
|---|---|
| 🔄 Ease of use | Create your first MCP-capable agent with just 6 lines of TypeScript code |
| 🤖 LLM Flexibility | Works with any LangChain-supported LLM that supports tool calling (OpenAI, Anthropic, Groq, Llama, etc.) |
| 🌐 HTTP Support | Direct connection to MCP servers running on specific HTTP ports |
| ⚙️ Dynamic Server Selection | TODO: Agents can dynamically choose the most appropriate MCP server for a given task from the available pool |
| 🧩 Multi-Server Support | TODO: Use multiple MCP servers simultaneously in a single agent |
| 🛡️ Tool Restrictions | TODO: Restrict potentially dangerous tools like file system or network access |
| 📝 Type Safety | TODO: Full TypeScript support with type definitions for all APIs and configurations |
With npm:

```shell
npm install mcp-use-ts
```

Or install from source:

```shell
git clone https://github.com/dforel/mcp-use-ts.git
cd mcp-use-ts
npm install
npm run build
```
mcp-use-ts works with various LLM providers through LangChain. You'll need to install the appropriate LangChain provider package for your chosen LLM. For example:
```shell
# For OpenAI
npm install @langchain/openai

# For Anthropic
npm install @langchain/anthropic
```

For other providers, check the [LangChain chat models documentation](https://js.langchain.com/docs/integrations/chat/). Then add the API keys for your chosen provider to your `.env` file:

```shell
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
```
**Important**: Only models with tool-calling capabilities can be used with mcp-use-ts. Make sure your chosen model supports function calling or tool use.
```typescript
import { config as loadEnv } from 'dotenv';
import { ChatOpenAI } from '@langchain/openai';
import { MCPAgent, MCPClient } from 'mcp-use-ts';

async function main() {
  // Load environment variables
  loadEnv();

  // Create configuration object
  const mcpConfig = {
    mcpServers: {
      playwright: {
        command: 'npx',
        args: ['@playwright/mcp@latest'],
        env: {
          DISPLAY: ':1'
        }
      }
    }
  };

  // Create MCPClient from configuration object
  const client = MCPClient.fromConfig(mcpConfig);

  // Create LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4' });

  // Create agent with the client
  const agent = new MCPAgent({
    llm,
    client,
    maxSteps: 30
  });

  try {
    // Run the query
    const result = await agent.run(
      'Find the best restaurant in San Francisco'
    );
    console.log('\nResult:', result);
  } finally {
    // Clean up resources
    await client.closeAllSessions();
  }
}

main().catch(console.error);
```
You can also load the server configuration from a config file:

```typescript
const client = MCPClient.fromConfigFile(
  path.join(__dirname, 'browser_mcp.json')
);
```

Example configuration file (`browser_mcp.json`):
```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
```
For other settings, models, and more, check out the documentation.
```typescript
import path from 'path';
import { config } from 'dotenv';
import { ChatOpenAI } from '@langchain/openai';
import { MCPAgent, MCPClient } from 'mcp-use-ts';

async function main() {
  // Load environment variables
  config();

  // Create MCPClient from config file
  const client = MCPClient.fromConfigFile(
    path.join(__dirname, 'browser_mcp.json')
  );

  // Create LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4' });
  // Alternative models:
  // const llm = new ChatAnthropic({ modelName: 'claude-3-sonnet-20240229' });
  // const llm = new ChatGroq({ modelName: 'llama3-8b-8192' });

  // Create agent with the client
  const agent = new MCPAgent({
    llm,
    client,
    maxSteps: 30
  });

  try {
    // Run the query
    const result = await agent.run(
      'Find the best restaurant in San Francisco USING GOOGLE SEARCH',
      { maxSteps: 30 }
    );
    console.log('\nResult:', result);
  } finally {
    // Clean up resources
    await client.closeAllSessions();
  }
}

main().catch(console.error);
```
This example demonstrates how to connect to an MCP server running on a specific HTTP port. Make sure to start your MCP server before running this example.
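As a sketch, an HTTP-based server entry would replace the `command`/`args` pair with a URL pointing at the running server. The exact key name and endpoint path here are assumptions (modeled on common MCP client configs), so check them against the mcp-use-ts documentation:

```json
{
  "mcpServers": {
    "http": {
      "url": "http://localhost:8931/sse"
    }
  }
}
```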
MCP-Use-TS allows configuring and connecting to multiple MCP servers simultaneously using the `MCPClient`. This enables complex workflows that require tools from different servers, such as web browsing combined with file operations or 3D modeling.
You can configure multiple servers in your configuration file:
```json
{
  "mcpServers": {
    "airbnb": {
      "command": "npx",
      "args": ["-y", "@openbnb/mcp-server-airbnb", "--ignore-robots-txt"]
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
```
The `MCPClient` class provides methods for managing connections to multiple servers. When creating an `MCPAgent`, you can provide an `MCPClient` configured with multiple servers.

By default, the agent has access to tools from all configured servers. If you need to target a specific server for a particular task, you can specify the `serverName` option when calling `agent.run()`.
```typescript
// Example: Manually selecting a server for a specific task
const airbnbResult = await agent.run(
  'Search for Airbnb listings in Barcelona',
  { serverName: 'airbnb' } // Explicitly use the airbnb server
);

const googleResult = await agent.run(
  'Find restaurants near the first result using Google Search',
  { serverName: 'playwright' } // Explicitly use the playwright server
);
```
If you only want to see debug information from the agent without enabling full debug logging, set the `verbose` option when creating an `MCPAgent`:
```typescript
// Create agent with increased verbosity
const agent = new MCPAgent({
  llm,
  client,
  verbose: true // Only shows debug messages from the agent
});
```
This is useful when you only need to see the agent's steps and decision-making process without all the low-level debug information from other components.
- [x] Multiple Servers at once
- [x] Test remote connectors (http, ws)
- [ ] ...
We love contributions! Feel free to open issues for bugs or feature requests.
- Node.js 18+
- TypeScript 5.0+
- MCP implementation (like Playwright MCP)
- LangChain and appropriate model libraries (OpenAI, Anthropic, etc.)
If you use MCP-Use-TS in your research or project, please cite:
```bibtex
@software{mcp_use_ts,
  author    = {dforel},
  title     = {MCP-Use-TS: MCP Library for TypeScript},
  year      = {2025},
  publisher = {GitHub},
  url       = {https://github.com/dforel/mcp-use-ts}
}
```
This project is a fork of mcp-use. I hope you enjoy it!
MIT