Anyone running Mimo-v2.5 quants with multimodal and MTP?
Has anyone been able to run a Q4 or Q5 quant of XiaomiMiMo/MiMo-V2.5 through llama.cpp with both working multimodal capability and MTP? Only AesSedai's GGUF quants appear to include an mmproj, and it's unclear whether those preserve the MTP layers.
I only have 40 GB of VRAM, but 256 GB of quad-channel DDR4 RAM, so I'm not expecting great inference speed. Still, I'm intrigued by the model's strength and multimodal capabilities and wanted to give it a go. It looks like MTP support in llama.cpp is still on a draft branch, so it seems I'll have to build from that.
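For reference, this is roughly the launch command I'd try. It's only a sketch: the quant and mmproj filenames are placeholders, the `-ot "exps=CPU"` expert-offload pattern assumes the model is MoE with tensor names matching that regex, and whatever flag (if any) enables MTP would depend on the draft branch:

```shell
# Sketch only: filenames are hypothetical placeholders, and MTP flag
# support depends on the draft branch being built.
#   -m / --mmproj  : main quant + multimodal projector file
#   -ngl 99        : offload as many layers as fit in the 40 GB of VRAM
#   -ot "exps=CPU" : keep MoE expert tensors in system RAM
#                    (common llama.cpp offload pattern, assuming MoE)
./llama-server \
  -m MiMo-V2.5-Q4_K_M.gguf \
  --mmproj mmproj-MiMo-V2.5.gguf \
  -ngl 99 \
  -ot "exps=CPU" \
  -c 16384
```

With dense layers on GPU and experts in DDR4, I'd expect throughput to be bottlenecked by the 4-channel memory bandwidth, which is why MTP/speculative decoding is appealing here.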