Unraid shows ZFS pools as “unmountable” and asks to format, but pools are healthy — how do I fix this?
Hi all,

As you can see in the screenshots, Unraid does detect the ZFS pools.
I’m running into an issue after migrating from a traditional Unraid array to a full ZFS setup, and I’m hoping someone here has seen this before.
From the ZFS side, everything looks completely healthy — the pools are imported, visible, and functioning as expected. However, in the Unraid UI, those same disks are being marked as unmountable and Unraid is prompting me to format them.
This seems to have been caused by a mistake I made during the migration/configuration process. While moving the second half of the disks out of the array, I ran New Config and accidentally reset Unraid's understanding of the ZFS setup, and I'm not sure how to cleanly recover from it.
The current state:
- ZFS pools are healthy and working
- Unraid incorrectly flags them as unmountable
- UI prompts to format (which I obviously want to avoid)
- On reboot, the pools don’t always mount cleanly
- I sometimes have to stop/start the array to get things working again
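For anyone wanting to double-check the "pools are healthy" part from the console before touching Unraid's config, a minimal sketch of verifying pool state is below. Since I can't assume a live pool here, the sample text stands in for real `zpool status` output; on the server you would pipe the actual command output in instead:

```shell
#!/bin/sh
# Sketch: report whether every "state:" line in `zpool status` output
# reads ONLINE. On a real Unraid box, replace the sample heredoc with:
#   zpool status | check_pools
check_pools() {
  awk '/^ *state:/ { if ($2 != "ONLINE") bad=1 } END { exit bad }'
}

# Sample output as it would appear for a healthy pool
sample='  pool: tank
 state: ONLINE
  scan: scrub repaired 0B
'

if printf '%s' "$sample" | check_pools; then
  echo "all pools ONLINE"
else
  echo "at least one pool is not ONLINE"
fi
```

`zpool status -x` is an even quicker check: it prints "all pools are healthy" when nothing is wrong.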
What I’m trying to do is essentially “reset” Unraid’s view of these disks so it recognizes that:
- the pools are valid
- no formatting is needed
- they should be treated as part of the system properly
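Whatever the fix turns out to be, I'd back up the flash config first, since that's where Unraid keeps its view of the disks (`super.dat` holds the assignments it compares against on boot). A minimal sketch, with both paths overridable so it can be dry-run outside Unraid; the backup destination is just my assumption, put it wherever is convenient:

```shell
#!/bin/sh
# Sketch: back up Unraid's flash config before running Tools -> New Config.
# /boot/config is Unraid's standard flash location; pass other paths as
# arguments to dry-run this elsewhere.
SRC="${1:-/boot/config}"
DEST="${2:-/boot/config-backup-$(date +%Y%m%d)}"

mkdir -p "$DEST"
# super.dat holds the disk assignments Unraid compares against on boot
cp "$SRC/super.dat" "$DEST/"
# per-pool .cfg files live alongside it (ignore if none exist)
cp "$SRC"/*.cfg "$DEST/" 2>/dev/null || true
echo "backed up to $DEST"
```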
Has anyone run into this before, or does anyone know how to clear/reset Unraid's disk state without risking data loss? Please see the screenshots above.
Appreciate any guidance — I’d really like to get this into a stable state.
Thanks!