r/LocalLLM
fast model for testing where accuracy doesn't matter
as title. i have ollama on an old underpowered box. i need a model that's quick, not resource intensive, and isn't required to do any deep or heavy thinking. speed over ability.
u/Lower-Impression-121 — 15 hours ago