@llamaindex/autotool

Automatically transpile your JavaScript functions into LLM Agent compatible tools.

Usage

First, install the package:

npm install @llamaindex/autotool
pnpm add @llamaindex/autotool
yarn add @llamaindex/autotool

Second, add the plugin/loader to your configuration:

Next.js

import { withNext } from "@llamaindex/autotool/next";

/** @type {import('next').NextConfig} */
const nextConfig = {};

export default withNext(nextConfig);

Node.js

node --import @llamaindex/autotool/node ./path/to/your/script.js
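
For example, as a package.json script (the entry path ./src/index.js is illustrative):

{
  "scripts": {
    "start": "node --import @llamaindex/autotool/node ./src/index.js"
  }
}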

Third, add the "use tool" directive at the top of your tool file, or rename the file so it ends in .tool.ts.

"use tool";

/**
 * Get the current weather for a city.
 * @param city The city to look up
 */
export function getWeather(city: string) {
  // ... call a weather API and return the result
}
// ... more tool functions
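
With the Node.js loader from step two, the collected tools can be handed straight to a LlamaIndex agent. A minimal sketch; the file names, the side-effect import, and the import paths are assumptions, not part of the package docs:

// main.ts (illustrative)
import { OpenAIAgent } from "llamaindex";
import { convertTools } from "@llamaindex/autotool";
import "./weather.tool"; // assumed: importing the tool file registers getWeather

const agent = new OpenAIAgent({
  tools: convertTools("llamaindex"),
});

// stream the agent response to stdout
const stream = await agent.chat({ stream: true, message: "How is the weather in Tokyo?" });
for await (const chunk of stream) {
  process.stdout.write(chunk.response.delta);
}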

Finally, in Next.js, export a chat handler (server action) to the frontend using a LlamaIndex Agent:

"use server";

import { OpenAIAgent } from "llamaindex";
import { convertTools } from "@llamaindex/autotool";
import { createStreamableUI } from "ai/rsc"; // Vercel AI SDK

export async function chatWithAI(message: string): Promise<JSX.Element> {
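  // convertTools("llamaindex") exposes every "use tool" function as LlamaIndex tools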
  const agent = new OpenAIAgent({
    tools: convertTools("llamaindex"),
  });
  const uiStream = createStreamableUI();
  agent
    .chat({
      stream: true,
      message,
    })
    .then(async (responseStream) => {
      return responseStream.pipeTo(
        new WritableStream({
          start: () => {
            uiStream.append("\n");
          },
          write: async (chunk) => {
            uiStream.append(chunk.response.delta);
          },
          close: () => {
            uiStream.done();
          },
        }),
      );
    });
  return uiStream.value;
}
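
On the client, the returned stream value can be rendered as a normal React node. A minimal sketch; the component and the action's import path are illustrative:

"use client";

import { useState, type ReactNode } from "react";
import { chatWithAI } from "./actions"; // path to the server action above is an assumption

export function Chat() {
  const [answer, setAnswer] = useState<ReactNode>(null);

  return (
    <div>
      <button onClick={async () => setAnswer(await chatWithAI("How is the weather in Tokyo?"))}>
        Ask
      </button>
      {answer}
    </div>
  );
}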

License

MIT
