# @andrao/llm-client

1.0.1 • Public • Published
This package provides a single interface for interacting with LLMs from Anthropic, OpenAI, Together.ai, and, locally, Ollama.
| Function | Description |
| --- | --- |
| `runChatCompletion` | Interoperable chat completion function |
| Function | Description |
| --- | --- |
| `getAnthropicClient` | Lazy-init an Anthropic SDK client |
| `getOllamaClient` | Lazy-init an Ollama client via the OpenAI SDK |
| `getOpenAIClient` | Lazy-init an OpenAI SDK client |
| `getTogetherClient` | Lazy-init a Together.ai client via the OpenAI SDK |