create-localai

0.1.14 • Public • Published

LocalAI Framework

Build AI apps faster: no LLM API keys, no cloud costs, no boilerplate. Just code.

LocalAI Framework is a zero-config, AI-native web framework that makes it easy to build applications with embedded LLMs. It provides a unified API for text generation, embeddings, and agentic workflows, all running locally on your machine.

Features

  • 🚀 Zero Configuration: Get started in seconds with our CLI
  • 🤖 Embedded LLM: Ships with TinyLlama for instant local inference
  • 🔌 Unified API: Simple React hooks for AI functionality
  • 💻 Local-First: No API keys or cloud costs required
  • 🔄 Hybrid Mode: Optional cloud provider fallback
  • 🛠 Developer Tools: Built-in AI playground and performance monitoring

Quick Start

# Create a new project
npx create-localai@latest my-ai-app

# Navigate to the project
cd my-ai-app

# Start the development server
npm run dev

Usage

import { useLLM } from '@localai/framework';

function MyAIComponent() {
  const { generate, isLoading } = useLLM();

  const handleClick = async () => {
    const response = await generate({
      prompt: "Write a short sci-fi story."
    });
    console.log(response.text);
  };

  return (
    <button onClick={handleClick} disabled={isLoading}>
      Generate Story
    </button>
  );
}

Configuration

// _app.tsx or similar
import { LLMProvider } from '@localai/framework';

function MyApp({ Component, pageProps }) {
  return (
    <LLMProvider config={{ model: 'tinyllama', temperature: 0.7 }}>
      <Component {...pageProps} />
    </LLMProvider>
  );
}
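The `temperature` option controls how sharply the model's next-token distribution is peaked: lower values make output more deterministic, higher values more varied. This is not the framework's internal code, just a self-contained sketch of the standard temperature-scaled softmax it refers to:

```typescript
// Temperature-scaled softmax: divide logits by temperature before normalizing.
// Lower temperature -> sharper (more deterministic); higher -> flatter (more random).
function softmaxWithTemperature(logits: number[], temperature: number): number[] {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract max for numerical stability
  const exps = scaled.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const logits = [2.0, 1.0, 0.1];
console.log(softmaxWithTemperature(logits, 0.7)); // sharper than temperature 1.0
console.log(softmaxWithTemperature(logits, 1.5)); // flatter distribution
```

At `temperature: 0.7` (the value in the config above), generation is slightly more focused than the neutral 1.0 default.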

Advanced Features

Agentic Workflows

import { defineAgent } from '@localai/framework';

const CodeAgent = defineAgent({
  role: "Senior Developer",
  tools: ['writeFile', 'runTests'],
  model: "phind-codellama"
});

// Use the agent (inside an async function, or a module with top-level await)
const result = await CodeAgent.execute("Refactor this function to use async/await");
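Agent frameworks of this kind typically run a loop that looks up a tool by name, invokes it, and feeds the result back to the model. The framework's internals are not documented here; the following is a generic, self-contained sketch of the tool-dispatch pattern, with every name hypothetical:

```typescript
// Hypothetical sketch of tool dispatch in an agent loop (not the framework's code).
type Tool = (input: string) => string;

interface AgentSpec {
  role: string;
  tools: Record<string, Tool>;
}

// Look up a tool by name on the agent and run it, failing loudly on unknown names.
function runTool(agent: AgentSpec, toolName: string, input: string): string {
  const tool = agent.tools[toolName];
  if (!tool) {
    throw new Error(`Agent "${agent.role}" has no tool named "${toolName}"`);
  }
  return tool(input);
}

// Toy stand-ins for tools like writeFile / runTests.
const demoAgent: AgentSpec = {
  role: "Senior Developer",
  tools: {
    writeFile: (input) => `wrote ${input.length} bytes`,
    runTests: () => "2 passed, 0 failed",
  },
};

console.log(runTool(demoAgent, "runTests", "")); // "2 passed, 0 failed"
```

Registering tools by string name, as in the `tools: ['writeFile', 'runTests']` example above, is what lets the model choose actions at runtime without the host code hard-wiring the call order.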

RAG (Coming Soon)

import { useRAG } from '@localai/framework';

// Hooks must be called inside a React component
function DocQA() {
  const { query } = useRAG({
    documents: ['doc1.pdf', 'doc2.pdf'],
    model: 'tinyllama'
  });

  const ask = async () => {
    const answer = await query("What do the documents say about X?");
    console.log(answer);
  };
  // ...
}
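RAG boils down to embedding the documents, retrieving the chunks closest to the query vector, and passing them to the model as context. Since the hook is not yet released, here is a self-contained sketch of the retrieval step only, using cosine similarity over toy vectors (all names and data are illustrative, not the framework's API):

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|).
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Rank document chunks by similarity to the query embedding, keep the top k.
function topK(query: number[], chunks: { text: string; vec: number[] }[], k: number) {
  return [...chunks]
    .sort((x, y) => cosine(query, y.vec) - cosine(query, x.vec))
    .slice(0, k);
}

// Toy 3-dimensional "embeddings" (a real embedder produces hundreds of dimensions).
const chunks = [
  { text: "Invoices are due in 30 days.", vec: [0.9, 0.1, 0.0] },
  { text: "The office closes at 6 pm.", vec: [0.0, 0.2, 0.9] },
];
const queryVec = [1.0, 0.0, 0.1]; // pretend this embeds "When are invoices due?"
console.log(topK(queryVec, chunks, 1)[0].text); // "Invoices are due in 30 days."
```

The retrieved chunks would then be prepended to the prompt before calling the local model, which is what lets a small model like TinyLlama answer questions about documents it was never trained on.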

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

MIT © LocalAI Team

Homepage

localai.dev

Collaborators

  • marcdhi