

▲ 9 r/Ubuntu
I tried the new snap local LLM feature with Qwen. I couldn't get it to do more than chat.
- It's extremely fast. The answers are almost instant, faster than when I was using Ollama. Although sometimes it would go off the rails and repeat itself.
- I hate this AI creep because of the amount of resources it uses: about 2 GB of RAM just to sit idle.
u/Dependent-Cow7823 — 16 days ago