A browser-friendly library for running LLM inference with Wllama, featuring preset and dynamic model loading, caching, and model download capabilities.
published version 0.1.3, 25 days ago
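Since this listing does not show the wrapper's own API, the sketch below illustrates what browser-side inference with the underlying Wllama engine can look like, assuming the upstream `@wllama/wllama` package. The WASM path keys, model URL, prompt, and sampling values are illustrative assumptions, not this library's documented interface.

```ts
// Minimal sketch of dynamic model loading and completion with @wllama/wllama.
// The config-path keys follow @wllama/wllama's documented pattern for locating
// its WASM binaries; adjust them for the installed version and bundler setup.
import { Wllama } from '@wllama/wllama';

// Assumed bundler-resolved URLs for the single- and multi-thread WASM builds.
import wasmSingle from '@wllama/wllama/esm/single-thread/wllama.wasm?url';
import wasmMulti from '@wllama/wllama/esm/multi-thread/wllama.wasm?url';

async function main(): Promise<void> {
  const wllama = new Wllama({
    'single-thread/wllama.wasm': wasmSingle,
    'multi-thread/wllama.wasm': wasmMulti,
  });

  // Dynamic model loading: fetch a GGUF model over HTTP (placeholder URL).
  await wllama.loadModelFromUrl(
    'https://huggingface.co/<user>/<repo>/resolve/main/model.gguf',
  );

  // Run a short completion against the loaded model.
  const output = await wllama.createCompletion('Once upon a time', {
    nPredict: 64,
    sampling: { temp: 0.7, top_p: 0.9 },
  });
  console.log(output);
}

main();
```

A wrapper like the one described would presumably add preset model lists and cache management on top of calls of this shape rather than replacing them.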