onenm_local_llm
v0.1.5 · MIT License
Flutter plugin for on-device LLM inference on Android using llama.cpp. It simplifies model management, loading, and multi-turn chat: no cloud, no API keys, fully offline.
Rank: #13136 overall (android)
Downloads (30d): 97 (as of 2026-05-06)
GitHub stars: — (no GitHub URL)
Likes (pub.dev): 4
Pub points: 160 / 160 (perfect score)
[Download trend chart: monthly downloads, sampled daily at 03:00 UTC]