
A browser-friendly library for running LLM inference using Wllama with preset and dynamic model loading, caching, and download capabilities.

Version 0.1.3 · published a month ago · 0 dependents · MIT license