llm_llamacpp
v0.1.9 · MIT License
llama.cpp backend implementation for LLM interactions. Enables local on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
#12431 overall · Android · iOS · Windows · Linux · macOS
Downloads / 30d: 115 (as of 2026-05-06)
GitHub stars: — (no GitHub URL)
Likes: 1 (on pub.dev)
Pub points: 160 / 160 (perfect score)
[Chart: 1-day trend of monthly downloads, sampled daily at 03:00 UTC; latest sample 2026-05-06]