@mastra/core

0.6.4 • Public • Published

The core foundation of the Mastra framework, providing essential components and interfaces for building AI-powered applications.

Installation

npm install @mastra/core

Overview

@mastra/core is the foundational package of the Mastra framework, providing:

  • Core abstractions and interfaces
  • AI agent management and execution
  • Integration with multiple AI providers
  • Workflow orchestration
  • Memory and vector store management
  • Telemetry and logging infrastructure
  • Text-to-speech capabilities

For comprehensive documentation, visit our official documentation.

Core Components

Agents (/agent)

Agents are autonomous AI entities that can understand instructions, use tools, and complete tasks. They encapsulate LLM interactions and can maintain conversation history, use provided tools, and follow specific behavioral guidelines through instructions.

import { openai } from '@ai-sdk/openai';
import { Agent } from '@mastra/core/agent';

const agent = new Agent({
  name: 'my-agent',
  instructions: 'Your task-specific instructions',
  model: openai('gpt-4o-mini'),
  tools: {}, // Optional tools
});

More agent documentation →

Embeddings (/embeddings)

The embeddings module provides a unified interface for converting text into vector representations across multiple AI providers. These vectors are essential for semantic search, similarity comparisons, and other NLP tasks.

import { openai } from '@ai-sdk/openai';
import { embed } from 'ai';

const embeddings = await embed({
  model: openai.embedding('text-embedding-3-small'),
  value: 'text to embed',
});

Currently supported providers are OpenAI, Cohere, Amazon Bedrock, Google AI, Mistral, and Voyage.
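Regardless of provider, the resulting vectors are usually compared with cosine similarity. A minimal, provider-independent sketch:

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|), ranging from -1 to 1.
// Higher values mean the two embedded texts are semantically closer.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('Vector dimensions must match');
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```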

More embeddings documentation →

Evaluations (/eval)

The evaluation system enables quantitative assessment of AI outputs. Create custom metrics to measure specific aspects of AI performance, from response quality to task completion accuracy.

import { Metric, evaluate, type MetricResult } from '@mastra/core';

class CustomMetric extends Metric {
  async measure(input: string, output: string): Promise<MetricResult> {
    // Your evaluation logic
    return { score: 0.95 };
  }
}
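As a concrete illustration, here is a hypothetical keyword-coverage metric, written standalone so it runs without the framework (a real implementation would extend Metric as shown above):

```typescript
// Hypothetical metric: scores an output by the fraction of expected
// keywords it contains. Illustrative only, not part of @mastra/core.
interface MetricResult {
  score: number;
}

class KeywordCoverageMetric {
  constructor(private keywords: string[]) {}

  async measure(_input: string, output: string): Promise<MetricResult> {
    const text = output.toLowerCase();
    const hits = this.keywords.filter((k) => text.includes(k.toLowerCase()));
    return { score: hits.length / this.keywords.length };
  }
}
```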

More evaluations documentation →

Memory (/memory)

Memory management provides persistent storage and retrieval of AI interactions. It supports different storage backends and enables context-aware conversations and long-term learning.

import { MastraMemory } from '@mastra/core';

const memory = new MastraMemory({
  // Memory configuration
});

Note: this is the base MastraMemory class, which is intended to be extended when developing custom agent memory strategies. To use a premade memory strategy (recommended), with long- and short-term memory built in, use import { Memory } from "@mastra/memory" instead.
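To illustrate what a custom memory strategy typically manages, here is a standalone toy (not the actual MastraMemory interface): per-thread message history with a capped short-term window.

```typescript
// Toy memory sketch (illustrative only; not the MastraMemory API):
// stores per-thread messages and caps the short-term context window.
type Message = { role: 'user' | 'assistant'; content: string };

class ToyMemory {
  private threads = new Map<string, Message[]>();

  constructor(private maxMessages: number = 20) {}

  add(threadId: string, message: Message): void {
    const history = this.threads.get(threadId) ?? [];
    history.push(message);
    // Keep only the most recent messages as the working context.
    this.threads.set(threadId, history.slice(-this.maxMessages));
  }

  get(threadId: string): Message[] {
    return this.threads.get(threadId) ?? [];
  }
}
```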

Visit the memory documentation to use Memory in your project →

Vector Stores (/vector)

Vector stores provide the infrastructure for storing and querying vector embeddings. They support semantic search, similarity matching, and efficient vector operations across different backend implementations.

import { MastraVector } from '@mastra/core';

class CustomVectorStore extends MastraVector {
  // Vector store implementation
}
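What such a backend implements can be sketched as a standalone in-memory store with cosine-similarity search (hypothetical; the actual MastraVector abstract methods may differ):

```typescript
// Minimal in-memory vector store sketch (illustrative only).
type Entry = { id: string; vector: number[] };

class InMemoryVectorStore {
  private entries: Entry[] = [];

  upsert(id: string, vector: number[]): void {
    this.entries = this.entries.filter((e) => e.id !== id);
    this.entries.push({ id, vector });
  }

  // Return the topK entries most similar to the query vector.
  query(vector: number[], topK: number = 1): Entry[] {
    const dot = (a: number[], b: number[]) =>
      a.reduce((sum, v, i) => sum + v * b[i], 0);
    const norm = (a: number[]) => Math.sqrt(dot(a, a));
    const score = (e: Entry) =>
      dot(e.vector, vector) / (norm(e.vector) * norm(vector) || 1);
    return [...this.entries].sort((x, y) => score(y) - score(x)).slice(0, topK);
  }
}
```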

More vector stores documentation →

Workflows (/workflows)

Workflows orchestrate complex AI tasks by combining multiple actions into a coherent sequence. They handle state management, error recovery, and can include conditional logic and parallel execution.

import { Workflow } from '@mastra/core';

const workflow = new Workflow({
  name: 'my-workflow',
  steps: [
    // Workflow steps
  ],
});
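The core idea, reduced to a standalone sketch (hypothetical; the real Workflow API adds typed steps, branching, and error recovery), is that each step receives the previous step's output:

```typescript
// Hypothetical sequential orchestrator (illustrative only):
// runs steps in order, piping each result into the next step.
type Step = (input: unknown) => Promise<unknown>;

async function runSequence(steps: Step[], initial: unknown): Promise<unknown> {
  let value = initial;
  for (const step of steps) {
    value = await step(value);
  }
  return value;
}
```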

More workflows documentation →

Tools (/tools)

Tools are functions that agents can use to interact with external systems or perform specific tasks. Each tool has a clear description and schema, making it easy for AI to understand and use them effectively.

import { ToolAction } from '@mastra/core';

const tool = new ToolAction({
  name: 'tool-name',
  description: 'Tool description',
  execute: async context => {
    // Tool implementation
  },
});
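Conceptually, the agent resolves a tool call by name and invokes its execute function with the model-supplied arguments. A simplified, standalone sketch of that dispatch (illustrative only, not the ToolAction internals):

```typescript
// Simplified tool registry and dispatch (illustrative only).
type Tool = {
  description: string;
  execute: (args: Record<string, unknown>) => Promise<unknown>;
};

const tools: Record<string, Tool> = {
  add: {
    description: 'Adds two numbers',
    execute: async (args) => (args.a as number) + (args.b as number),
  },
};

async function callTool(name: string, args: Record<string, unknown>): Promise<unknown> {
  const tool = tools[name];
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.execute(args);
}
```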

More tools documentation →

Logger (/logger)

The logging system provides structured, leveled logging with multiple transport options. It supports debug information, performance monitoring, and error tracking across your AI applications.

import { createLogger, LogLevel } from '@mastra/core';

const logger = createLogger({
  name: 'MyApp',
  level: LogLevel.INFO,
});

More logging documentation →

Telemetry (/telemetry)

Telemetry provides OpenTelemetry integration for comprehensive monitoring of your AI systems. Track latency, success rates, and system health with distributed tracing and metrics collection.

import { Telemetry } from '@mastra/core';

const telemetry = Telemetry.init({
  serviceName: 'my-service',
});

More Telemetry documentation →

Additional Resources

Package Info

  • Version: 0.6.4
  • License: ISC
  • Unpacked size: 815 kB
  • Total files: 163
  • Weekly downloads: 15,591

Collaborators

  • smthomas
  • abhiaiyer
  • ifedayo
  • adeleke5140
  • taofeeq-deru
  • ehindero
  • calcsam
  • rase-
  • wardpeet
  • tylerbarnes
  • nikaiyer