openai-mcp

Version 0.0.7

A TypeScript library that provides an OpenAI-compatible client for the Model Context Protocol (MCP).

Installation

npm install openai-mcp

Features

  • OpenAI API compatibility - works as a drop-in replacement for the OpenAI client
  • Connects to local or remote Model Context Protocol servers
  • Supports tool use and function calling
  • Rate limiting and retry logic built in
  • Configurable logging
  • TypeScript type definitions included
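The README states that rate limiting and retry logic are built in but does not document them. For readers unfamiliar with the pattern, a minimal sketch of the exponential-backoff retry loop clients like this typically use (this is an illustration, not the library's actual implementation):

```typescript
// Sketch of an exponential-backoff retry wrapper (illustrative only,
// not openai-mcp's internal implementation).
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off 500ms, 1000ms, 2000ms, ... between attempts.
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt),
      );
    }
  }
  throw lastError;
}
```

Because the library handles this internally, no such wrapper is needed in application code.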

Usage

import { OpenAI } from 'openai-mcp';

// Create an OpenAI-compatible client connected to an MCP server
const openai = new OpenAI({
  mcp: {
    // MCP server URL(s) to connect to
    serverUrls: ['http://localhost:3000/mcp'],
    
    // Optional: set log level (debug, info, warn, error)
    logLevel: 'info',
    
    // Additional configuration options
    // modelName: 'gpt-4',             // Default model to use
    // disconnectAfterUse: true,       // Auto-disconnect after use
    // maxToolCalls: 15,               // Max number of tool calls per conversation
    // toolTimeoutSec: 60,             // Timeout for tool calls
  }
});

// Use the client like a standard OpenAI client
const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello, how are you today?' }
  ]
});

console.log(response.choices[0].message.content);
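Since the client is OpenAI-compatible and the feature list includes tool use and function calling, tools would presumably be declared in the standard OpenAI function-calling format. A minimal sketch of a tool definition and a local dispatcher (the `get_time` tool, its schema, and the `dispatch` helper here are hypothetical examples, not part of the library's API):

```typescript
// A tool declared in the standard OpenAI function-calling format.
// The tool name and schema are hypothetical, for illustration.
const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'get_time',
      description: 'Return the current time for a given IANA timezone.',
      parameters: {
        type: 'object',
        properties: {
          timezone: { type: 'string', description: 'e.g. "Europe/Berlin"' },
        },
        required: ['timezone'],
      },
    },
  },
];

// Local handlers keyed by tool name.
const handlers: Record<string, (args: any) => string> = {
  get_time: ({ timezone }) =>
    new Date().toLocaleTimeString('en-US', { timeZone: timezone }),
};

// Execute a tool call as it would arrive from the API:
// a tool name plus JSON-encoded arguments.
function dispatch(name: string, argsJson: string): string {
  const handler = handlers[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(JSON.parse(argsJson));
}
```

As with the upstream OpenAI SDK, the `tools` array would be passed to `chat.completions.create` alongside `messages`, and tool results returned to the model as `role: 'tool'` messages.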

Logging Configuration

import { setMcpLogLevel } from 'openai-mcp';

// Set log level to one of: 'debug', 'info', 'warn', 'error'
setMcpLogLevel('info');

Environment Variables

The library also supports configuration through environment variables:

# MCP Server URL(s) - comma separated for multiple servers
MCP_SERVER_URL=http://localhost:3000/mcp

# API Keys for different model providers
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
GEMINI_API_KEY=your-gemini-api-key
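The library reads these variables itself; for clarity, a sketch of how a comma-separated MCP_SERVER_URL value would split into the `serverUrls` list shown in the Usage section (the `parseServerUrls` helper is illustrative, not exported by the package):

```typescript
// Illustrative helper: split a comma-separated MCP_SERVER_URL value
// into a list of server URLs, ignoring whitespace and empty entries.
function parseServerUrls(value: string | undefined): string[] {
  return (value ?? '')
    .split(',')
    .map((url) => url.trim())
    .filter((url) => url.length > 0);
}
```

With the variables exported, the client can be constructed without explicit options, as in the Multi-Model Support example below.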

Multi-Model Support

The library supports routing requests to different model providers based on the model name:

import { OpenAI } from 'openai-mcp';

const openai = new OpenAI();

// Uses OpenAI API
const gpt4Response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello GPT-4' }]
});

// Uses Anthropic API
const claudeResponse = await openai.chat.completions.create({
  model: 'claude-3',
  messages: [{ role: 'user', content: 'Hello Claude' }]
});

// Uses Google Gemini API
const geminiResponse = await openai.chat.completions.create({
  model: 'gemini-pro',
  messages: [{ role: 'user', content: 'Hello Gemini' }]
});

Examples

The examples/ directory contains various usage examples. See the Examples README for details on running them.

Development

To build the library:

npm run build

To run tests:

npm test

License

MIT

Collaborators

  • paulmeller