u/Dependent-Cow7823

r/Ubuntu

I tried the new snap local LLM feature with qwen. I couldn't get it to do more than chat.

  1. It's extremely fast. The answers are almost instant, faster than when I was using Ollama. That said, it would sometimes go off the rails and repeat itself.
  2. I hate this AI creep because of the amount of resources it uses: about 2 GB of RAM while idle.
Posted 16 days ago