Core OpenTelemetry instrumentation library for TypeScript applications with advanced evaluation capabilities for AI systems.
`@traceai/fi-core` provides a comprehensive tracing solution built on OpenTelemetry, designed specifically for AI applications. It offers custom span exporters, evaluation tags, and seamless integration with the TraceAI platform for observability and performance monitoring.
- OpenTelemetry Integration: Built on top of OpenTelemetry APIs with custom implementations
- Custom Span Exporter: HTTP-based span exporter with configurable endpoints
- AI Evaluation Tags: Comprehensive evaluation system for AI applications with 50+ built-in evaluators
- Project Management: Support for project versioning, sessions, and metadata
- Flexible Configuration: Environment variable and programmatic configuration support
- TypeScript Support: Full TypeScript support with comprehensive type definitions
```shell
npm install @traceai/fi-core
# or
pnpm add @traceai/fi-core
# or
yarn add @traceai/fi-core
```
This package supports both CommonJS and ESM module systems for maximum compatibility.
```typescript
// ESM
import { register, ProjectType, EvalTag } from '@traceai/fi-core';
```

```typescript
// CommonJS
const { register, ProjectType, EvalTag } = require('@traceai/fi-core');
```
For optimal compatibility, ensure your `tsconfig.json` includes:

```json
{
  "compilerOptions": {
    "moduleResolution": "node",
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true
  }
}
```

The `module` setting can be `"commonjs"`, `"esnext"`, or any other module system your project requires.
```typescript
import { register, ProjectType } from '@traceai/fi-core';

// Initialize tracing with minimal configuration
const tracerProvider = register({
  projectName: 'my-ai-project',
  projectType: ProjectType.EXPERIMENT,
});
```
```typescript
import {
  register,
  ProjectType,
  EvalTag,
  EvalTagType,
  EvalName,
  EvalSpanKind,
  ModelChoices,
} from '@traceai/fi-core';

// Create evaluation tags for AI model monitoring
const evalTags = [
  new EvalTag({
    type: EvalTagType.OBSERVATION_SPAN,
    value: EvalSpanKind.LLM,
    eval_name: EvalName.CONTEXT_ADHERENCE,
    custom_eval_name: 'custom_context_check',
    mapping: {
      context: 'raw.input',
      output: 'raw.output',
    },
    model: ModelChoices.TURING_SMALL,
  }),
];

// Register with comprehensive configuration
const tracerProvider = register({
  projectName: 'advanced-ai-project',
  projectType: ProjectType.EXPERIMENT,
  projectVersionName: 'v1.0.0',
  evalTags: evalTags,
  sessionName: 'experiment-session-1',
  verbose: true,
});
```
The `FITracerProvider` extends OpenTelemetry's `BasicTracerProvider` with custom functionality:
- Custom HTTP span exporter
- Automatic resource detection and configuration
- Built-in UUID generation for trace and span IDs
- Configurable batch or simple span processing
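OpenTelemetry trace and span IDs follow the W3C Trace Context format: a 16-byte trace ID and an 8-byte span ID, each rendered as lowercase hex. The sketch below shows that format only; it is not `FITracerProvider`'s actual generator:

```typescript
// Sketch of W3C Trace Context-style ID generation (illustrative, not the
// library's implementation): 32 hex chars for a trace ID, 16 for a span ID.
import { randomBytes } from 'crypto';

function generateTraceId(): string {
  return randomBytes(16).toString('hex'); // 16 bytes -> 32 lowercase hex chars
}

function generateSpanId(): string {
  return randomBytes(8).toString('hex'); // 8 bytes -> 16 lowercase hex chars
}
```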
The evaluation system provides comprehensive AI model assessment capabilities:
- Content Quality: Context adherence, completeness, groundedness
- Safety & Moderation: Toxicity, PII detection, content moderation
- Technical Validation: JSON validation, regex matching, length checks
- AI-Specific: Conversation coherence, prompt injection detection
- Custom Evaluations: API calls, custom code evaluation, agent-as-judge
Evaluation tags can target the following span kinds:

- `LLM`: Large Language Model operations
- `AGENT`: AI agent executions
- `TOOL`: Tool usage and function calls
- `RETRIEVER`: Information retrieval operations
- `EMBEDDING`: Vector embedding operations
- `RERANKER`: Result reranking operations
Two project types are supported:

- `EXPERIMENT`: for experimental AI development and testing
- `OBSERVE`: for production monitoring and observability
Configuration can also be supplied via environment variables:

- `FI_BASE_URL`: base URL for the TraceAI collector
- `FI_API_KEY`: API key for authentication
- `FI_SECRET_KEY`: secret key for authentication
```typescript
import { trace } from '@opentelemetry/api';
import {
  register,
  ProjectType,
  EvalTag,
  EvalTagType,
  EvalName,
  EvalSpanKind,
  ModelChoices,
} from '@traceai/fi-core';

// Initialize
register({
  projectName: 'llm-chat-bot',
  projectType: ProjectType.EXPERIMENT,
  evalTags: [
    new EvalTag({
      type: EvalTagType.OBSERVATION_SPAN,
      value: EvalSpanKind.LLM,
      eval_name: EvalName.CHUNK_ATTRIBUTION,
      config: {},
      custom_eval_name: 'Chunk_Attribution_5',
      mapping: {
        context: 'raw.input',
        output: 'raw.output',
      },
      model: ModelChoices.TURING_SMALL,
    }),
  ],
});

// Create traces
const tracer = trace.getTracer('my-app');
const span = tracer.startSpan('llm-completion');
span.setAttributes({
  'llm.model': 'gpt-4o-mini',
  'llm.prompt': 'What is the capital of France?',
  'llm.response': 'The capital of France is Paris.',
});
span.end();
```
```shell
pnpm build
pnpm test
pnpm lint
```
This package is part of the TraceAI project. Please refer to the main repository for contribution guidelines.