Seamlessly integrate Opik observability with your OpenAI applications to trace, monitor, and debug your LLM API calls.
- 🔍 Comprehensive Tracing: Automatically trace OpenAI API calls and completions
- 📊 Hierarchical Visualization: View your OpenAI execution as a structured trace with parent-child relationships
- 📝 Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
- 🚨 Error Handling: Capture and visualize errors in your OpenAI API interactions
- 🏷️ Custom Tagging: Add custom tags to organize and filter your traces
- 🔄 Streaming Support: Full support for streamed completions and chat responses (see the streaming example after the basic usage snippet below)
```bash
# npm
npm install opik-openai

# yarn
yarn add opik-openai

# pnpm
pnpm add opik-openai
```
- Node.js ≥ 18
- OpenAI SDK (`openai` ≥ 4.0.0)
- Opik SDK (automatically installed as a dependency)
```typescript
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";

// Initialize the OpenAI client
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Wrap the client with Opik tracking
const trackedOpenAI = trackOpenAI(openai, {
  // Optional configuration
  traceMetadata: {
    tags: ["production", "my-app"],
  },
});

// Use the tracked client just like the original
async function main() {
  const completion = await trackedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello world" }],
  });

  console.log(completion.choices[0].message);

  // Flush traces at the end of your application
  await trackedOpenAI.flush();
}

main().catch(console.error);
```
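Streaming works through the same wrapped client. The sketch below is a minimal example, assuming the `trackedOpenAI` client from the snippet above; the streaming code itself is standard OpenAI SDK usage, and the wrapper is assumed to assemble the streamed chunks into a single trace.

```typescript
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";

// new OpenAI() reads OPENAI_API_KEY from the environment by default
const trackedOpenAI = trackOpenAI(new OpenAI());

async function streamExample() {
  // Request a streamed chat completion through the tracked client
  const stream = await trackedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Tell me a short story." }],
    stream: true,
  });

  // Standard OpenAI SDK streaming: iterate over chunks as they arrive
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }

  // Flush buffered traces before the process exits
  await trackedOpenAI.flush();
}

streamExample().catch(console.error);
```

If a call throws, the error should be captured on the trace as well (per the error-handling feature above); wrapping calls in `try`/`finally` and calling `flush()` in the `finally` block is a safe pattern for short-lived scripts.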
To view your traces:
- Sign in to your Comet account
- Navigate to the Opik section
- Select your project to view all traces
- Click on a specific trace to see the detailed execution flow
Apache 2.0