
UniAI

To Unify AI Models!

UniAI, built on Node.js, is an integrated AI model library. It provides a unified interface for a variety of models, streamlining development by keeping model inputs and outputs consistent.

Chat

Imagine

Prompt: Pink dress, Candy, Sandy, Mandy, short hair, blonde hair, bangs, forehead, red lipstick, elbow gloves, hair accessories, high heels, sitting, cross legged, high chair, cocktail, holding cocktail glass, looking through the glass.
Negative Prompt: EasyNegative, badhandv4, badv5, aid210, aid291.
Sample images generated with MidJourney, Stability AI v1.6, and OpenAI DALL-E-3.
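
Using the client shown in the Easy to Use section below, the prompt and negative prompt above can be passed to .imagine() in one call. This is only a sketch: the negativePrompt option name is an assumption, so check the package's typings for the exact option fields.

// sketch only: `negativePrompt` is an assumed option name, verify it against uniai's typings
// `ai` is the UniAI client created in the Easy to Use section below
const prompt = 'Pink dress, Candy, Sandy, Mandy, short hair, blonde hair, bangs, ...'
const negativePrompt = 'EasyNegative, badhandv4, badv5, aid210, aid291'
const task = await ai.imagine(prompt, { negativePrompt })
// query the task to check progress and fetch the generated images
const result = await ai.task(task.taskId)
console.log(result)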

Easy to Use

import UniAI from 'uniai'
// fill in the config for the provider/model you want to use!
const ai = new UniAI({ OpenAI: { key: 'Your key', proxy: 'Your proxy API' } })
// chat model
const chat = await ai.chat('hello world')
// embedding model
const embedding = await ai.embedding('hello world')
// imagine model
const task = await ai.imagine('a panda is eating bamboo')
// show imagining tasks, get generated images
const image = await ai.task(task.taskId)
// change image, Midjourney only, return a new task
const task2 = await ai.change('midjourney', task.taskId, 'UPSCALE', 4)
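
The embedding call above returns vectors you can compare yourself, for example with cosine similarity. The sketch below assumes that .embedding also accepts an array of strings and that the vectors come back in an embedding field (number[][]); check uniai's embedding response type for the exact shape.

// sketch: the array input and the `embedding` field name are assumptions here
const { embedding } = await ai.embedding(['hello world', 'hi there'])
const [a, b] = embedding
// cosine similarity between the two embedded inputs
const dot = a.reduce((sum, v, i) => sum + v * b[i], 0)
const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0))
console.log('cosine similarity:', dot / (norm(a) * norm(b)))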

English · 🇨🇳 Chinese (中文说明)

Supported Models

Applications Developed on UniAI

We have developed several sample applications using uniai; see the Who is using it section below.


Install

Using yarn:

yarn add uniai

Using npm:

npm install uniai

Example

A simple demo is provided in the /examples folder. You can read that code directly to learn how to use UniAI, or follow the documentation below.

List Models

You can use .models to list all the available models in UniAI.

TypeScript & JavaScript ES6+

import UniAI from 'uniai'

const ai = new UniAI()
console.log(ai.models)

JavaScript ES5

const UniAI = require('uniai').default

const ai = new UniAI()
console.log(ai.models)

Output

[
    {
        "provider": "OpenAI",
        "value": "openai",
        "models": [
            "gpt-3.5-turbo-1106",
            "gpt-3.5-turbo",
            "gpt-3.5-turbo-16k",
            "gpt-4",
            "gpt-4-32k",
            "gpt-4-1106-preview",
            "gpt-4-vision-preview"
        ]
    },
    // ...providers and models
    {
        "provider": "StabilityAI",
        "value": "stability.ai",
        "models": ["stable-diffusion-v1-6", "stable-diffusion-xl-1024-v1-0"]
    }
]
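
Because .models returns plain objects, you can filter the list to check whether a provider or model is available before calling it. A small example based on the structure shown above:

// find the OpenAI entry and check whether a specific chat model is listed
const openai = ai.models.find(p => p.provider === 'OpenAI')
const hasGPT4 = openai ? openai.models.includes('gpt-4') : false
console.log(hasGPT4) // true with the output above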

Chat

To interact with a model, use .chat() and remember to provide the required API key or secret parameters when initializing new UniAI().

The default chat model is OpenAI gpt-3.5-turbo; provide your OpenAI key and, optionally, your proxy API.

const key: string | string[] = 'Your OpenAI Key (required), supports multiple keys'
const proxy = 'Your OpenAI API proxy (optional)'
const uni = new UniAI({ OpenAI: { key, proxy } })
const res = await uni.chat()
console.log(res)

Output

{
    "content": "I am OpenAI's language model trained to assist with information.",
    "model": "gpt-3.5-turbo-0613",
    "object": "chat.completion",
    "promptTokens": 20,
    "completionTokens": 13,
    "totalTokens": 33
}
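
.chat() also accepts an array of messages, so you can keep the conversation history yourself. A minimal sketch follows; the system and assistant role values here follow the usual chat convention, so check uniai's message type for the exact roles it accepts.

// carry the dialogue history as an array of role/content messages
// role values other than 'user' are an assumption based on common chat APIs
const messages = [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'What is UniAI?' },
    { role: 'assistant', content: 'UniAI is a unified Node.js library for AI models.' },
    { role: 'user', content: 'Which model am I talking to right now?' }
]
const reply = await uni.chat(messages)
console.log(reply)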

Chat with image

const input = [
    {
        role: 'user',
        content: 'Describe this picture, is it a man or a woman, and what is she doing?',
        img: 'https://pics7.baidu.com/feed/1f178a82b9014a903fcc22f1e98d931fb11bee90.jpeg@f_auto?token=d5a33ea74668787d60d6f61c7b8f9ca2'
    }
]
// Warning: if you choose a model without vision support, the img attribute will be dropped!
const res = await ai.chat(input, { model: 'gpt-4-vision-preview' })
console.log(res)

Output

{
    "content": "The image shows a person taking a mirror selfie using a smartphone...",
    "model": "gpt-4-1106-vision-preview",
    "object": "chat.completion",
    "promptTokens": 450,
    "completionTokens": 141,
    "totalTokens": 591
}

Streaming Chat

For streaming chat, the response is a Readable stream whose chunks are JSON buffers.

The following example chats with Google gemini-pro in stream mode.

import UniAI, { ModelProvider, GoogleChatModel } from 'uniai'
import { Readable } from 'stream'

const key: string | string[] = 'Your Google Key (required), supports multiple keys'
const proxy = 'Your Google API proxy (optional)'
const uni = new UniAI({ Google: { key, proxy } })
const input = 'Hello, who are you?'
const res = await uni.chat(input, { stream: true, provider: ModelProvider.Google, model: GoogleChatModel.GEM_PRO })
const stream = res as Readable
let data = ''
stream.on('data', chunk => (data += JSON.parse(chunk.toString()).content))
stream.on('end', () => console.log(data))

Output (Stream)

Language model trained by Google, at your service.
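
If you prefer await over event callbacks, the returned Readable can be wrapped in a small helper. This sketch only relies on Node's standard stream events:

import { Readable } from 'stream'

// collect a UniAI chat stream into the full response text
function collect(stream: Readable): Promise<string> {
    return new Promise((resolve, reject) => {
        let text = ''
        stream.on('data', chunk => (text += JSON.parse(chunk.toString()).content))
        stream.on('end', () => resolve(text))
        stream.on('error', reject)
    })
}

console.log(await collect(res as Readable))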

Running Tests

UniAI uses Jest to run unit tests on all supported models.

yarn test

If you want to run unit tests for a specific model provider:

# OpenAI, Google, Baidu, IFlyTek, MoonShot, GLM, Other, Imagine...
yarn test OpenAI

Thanks

Institute of Intelligent Computing Technology, Suzhou, CAS

Contributors

Youwei Huang

Weilong Yu

huangyw@iict.ac.cn

Who is using it

UniAI MaaS: a unified API platform designed to simplify interaction with a variety of complex AI models.
LeChat: document analysis based on large language models, with dialogue through WeChat Mini Programs.
LeChat Pro: a full-platform client based on UniAI, a multi-model streaming dialogue platform.

Star History

Star History Chart

License

MIT

Copyright (c) 2022-present, Youwei Huang
