This is the LangChain wrapper for the Maxim JS SDK. Install it with:
```bash
npm install @maximai/maxim-js-langchain
```
Initialize the Maxim logger, wrap it in a `MaximLangchainTracer`, and attach the tracer as a callback on your model:

```js
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";
import { ChatOpenAI } from "@langchain/openai";

// Create a logger bound to your Maxim log repository and wrap it in the tracer.
const maxim = new Maxim({ apiKey: "maxim-api-key" });
const logger = await maxim.logger({ id: "log-repository-id" });
const maximTracer = new MaximLangchainTracer(logger);

const llm = new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    modelName: "gpt-4o",
    temperature: 0,
    callbacks: [maximTracer],
    metadata: {
        maxim: {
            // optional: add metadata to the generation
            generationName: "maths-gen",
            generationTags: { tag: "test" },
        },
    },
});

const query = "What's the sum of 3 and 2?";
const result = await llm.invoke(query);
```
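Because the tracer is a LangChain callback handler, it can also be passed per invocation, including on chains built from the model. A minimal sketch, reusing `llm` and `maximTracer` from above (the prompt itself is illustrative, not part of the SDK):

```js
import { ChatPromptTemplate } from "@langchain/core/prompts";

// A simple prompt → model chain; generations run inside the chain are logged
// to Maxim through the same tracer, passed here via the invocation config.
const prompt = ChatPromptTemplate.fromTemplate("What's the sum of {a} and {b}?");
const chain = prompt.pipe(llm);

const answer = await chain.invoke(
    { a: 3, b: 2 },
    { callbacks: [maximTracer] }
);
```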
| Feature | Anthropic | Bedrock Anthropic | Bedrock Meta | OpenAI | Azure |
|---|---|---|---|---|---|
| Chat (0.3.x) | ✅ | ✅ | ✅ | ✅ | ✅ |
| Tool call (0.3.x) | ✅ | ✅ | ⛔️ | ✅ | ✅ |
| Chain (via LLM) (0.3.x) | ✅ | ✅ | ✅ | ✅ | ✅ |
| Streaming (0.3.x) | ✅ | ✅ | ✅ | ✳️ | ✳️ |
| Agent (0.3.x) | ⛔️ | ⛔️ | ⛔️ | ⛔️ | ⛔️ |
✳️ LangChain does not report token usage for streamed responses from these providers.
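Tool-call logging works through the same callback for the providers marked ✅ above. A minimal sketch, assuming an OpenAI chat model and a hypothetical `add` tool defined with zod (the tool name and schema are for illustration only):

```js
import { z } from "zod";
import { tool } from "@langchain/core/tools";

// Hypothetical tool used only for illustration.
const add = tool(async ({ a, b }) => String(a + b), {
    name: "add",
    description: "Adds two numbers",
    schema: z.object({ a: z.number(), b: z.number() }),
});

// Bind the tool to the model; tool calls made by the model are traced
// alongside the generation when the tracer is attached as a callback.
const llmWithTools = llm.bindTools([add]);
const response = await llmWithTools.invoke("What's the sum of 3 and 2?", {
    callbacks: [maximTracer],
});
```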
Changelog:

- Feat: Adds LangGraph support
- Feat: Adds support for tool-call logging via callbacks
- Fix: Token usage capturing for Anthropic models
- Fix: Model name and provider parsing when using LangChain
- Fix: Peer dependency changed from langchain to @langchain/core
- Chore: tsconfig target moved to es2017
- Feat: Updates peer dependency for the Maxim SDK
- Feat: Adds support for LangChain 0.3.x
- Improvement: Easier logging of retrieval and generation
- Breaking: Drops support for LangChain versions below 0.3.x
- Improvement: Changed version pinning for the core Maxim SDK
- Fix: Token usage was not being captured for streaming responses
- Fix: Adds empty tags as `{}`
- Early preview