duck4i
Native Node.js plugin to run LLAMA inference directly on your machine with no other dependencies.
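To illustrate how the plugin is meant to be used, here is a minimal sketch that loads a local GGUF model and runs a single prompt in-process. The package name `@duck4i/llama`, the `RunInference` export, its argument order, and the model path are assumptions made for this example rather than confirmed API details; check the package README for the actual interface.

```js
// Rough usage sketch (assumed API): run one prompt against a local GGUF model.
// "@duck4i/llama", RunInference, and the (modelPath, systemPrompt, userPrompt)
// signature are assumptions for illustration, not verified package details.
const { RunInference } = require("@duck4i/llama");

const systemPrompt = "You are a helpful assistant.";
const userPrompt = "Summarize what a GGUF model file is in one sentence.";

// Inference runs entirely in-process against the local model file,
// so no external server or extra runtime dependency is needed.
const reply = RunInference("models/llama-3.2-1b.gguf", systemPrompt, userPrompt);
console.log(reply);
```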
This is mostly for testing the client and server, but it is also included in the Docker image.
Please note that this package is part of a larger turbo monorepo.
Compress Lottie files to reduce file size.