Search results

11 packages found

Calculate the token consumption and cost of OpenAI GPT messages

published version 1.3.14, 8 days ago • 5 dependents • licensed under MIT
20,203 weekly downloads

⏳ js-tiktoken on Cloudflare Pages

published version 1.0.10, 3 days ago • 0 dependents • licensed under MIT
2,393 weekly downloads

GPT token estimation and context size utilities without a full tokenizer

published version 0.4.1, 3 months ago • 0 dependents • licensed under MIT
2,325 weekly downloads
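As an illustration of the kind of heuristic such "estimation without a full tokenizer" packages rely on (a rough sketch under the common ~4-characters-per-token rule of thumb for English text, not this package's actual implementation):

```javascript
// Rough GPT token estimate without a tokenizer.
// Assumes the common heuristic of ~4 characters per token
// for typical English text (illustration only).
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Check whether a prompt fits a model's context window,
// reserving some tokens for the completion.
function fitsContext(text, contextSize, reservedForOutput = 0) {
  return estimateTokens(text) + reservedForOutput <= contextSize;
}
```

The trade-off is accuracy for size: no encoder tables ship with the package, so estimates can be off for code, non-English text, or unusual tokenization.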

Model Context Protocol server for Obsidian integration with token-aware response handling

published version 1.3.0, 22 days ago • 0 dependents • licensed under Apache-2.0
700 weekly downloads

A service for calculating, managing, and truncating OpenAI prompt tokens

published version 2.3.6, 4 months ago • 0 dependents • licensed under MIT
471 weekly downloads

Anthropic/AWS Bedrock TypeScript SDK.

published version 0.0.5, a year ago • 0 dependents • licensed under MIT
74 weekly downloads

GPT token estimation and context size utilities without a full tokenizer

published version 0.5.2, 6 months ago • 0 dependents • licensed under MIT
52 weekly downloads

I needed the SentenceSplitter from llamaindex but had to import the entire llamaindex package, which is 1 GB. I pulled it out and had GPT make a standalone version. It's not exactly the same, but close.

published version 1.2.0, 2 months ago • 0 dependents • licensed under ISC
37 weekly downloads

Fast tokenizer.

published version 0.1.0, 4 months ago • 0 dependents • licensed under MIT
14 weekly downloads

Pure JavaScript version of OpenAI's tiktoken

published version 0.0.1, 2 years ago • 0 dependents • licensed under MIT
13 weekly downloads

tiktoken for NestJS

published version 0.0.1, a year ago • 0 dependents • licensed under MIT
11 weekly downloads