u/iwannaredditonline

Nvidia GPU / CUDA missing in Open WebUI Desktop on Linux

Hey guys,

Currently trying to set up the native version of Open WebUI with the desktop app. I've installed several different file versions and app builds (AppImage and Flatpak), and I do see references to "Nvidia" while it's installing, but when I go to the Llama.cpp menu inside the app, Nvidia / CUDA is not an option. The only options are CPU, ROCm, and the other two. I enabled the software with full permissions, but it still doesn't show up as an option. I tried on Windows and both CUDA 12 and 13 were available. I am using Fedora 43. The issue seems isolated to Linux, and I haven't seen anyone else mention it.

Are you guys having this issue? Is there a way to modify the llama install to enable/recompile CUDA support, instead of having to do a separate install of llama.cpp and adding it manually in the server area? I currently use LM Studio and want to scrap it to use Open WebUI directly, without additional software/overhead.
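In case it helps anyone in the same spot, here's the workaround I was hoping to avoid: a rough sketch of building llama.cpp with CUDA yourself and pointing Open WebUI at it as an external server. The paths, model filename, and port below are placeholders, and this assumes you already have the CUDA toolkit (nvcc) installed from NVIDIA's repo:

```shell
# Grab llama.cpp and build it with the CUDA backend enabled
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release -j

# Serve a model (path is a placeholder); -ngl 99 offloads all layers to the GPU
./build/bin/llama-server -m /path/to/model.gguf --port 8080 -ngl 99
```

llama-server exposes an OpenAI-compatible API, so you can then add `http://localhost:8080/v1` as a connection in Open WebUI's settings. Not what I want long-term, but it at least gets the 5090 doing the work.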

Specs

Fedora 43

RTX 5090 32GB — NVIDIA driver 595.71.05, CUDA version 13.2 (installed from NVIDIA's direct repo, not RPM Fusion)
