
are we getting new Flux models soon? hopefully open source. Would love a new klein model

You can start with the two models: Eros, which is better for I2V (image-to-video), and Sulphur, which works for both I2V and T2V (text-to-video). If you don't know what any of that means, you've got a long road ahead of you, but I promise it'll be worth it in the end.
This is not an ad and this is not a paid service. You can run this on your PC for free, right now. Just letting y'all know that you no longer have to bother with Grok. The video attached below was my first attempt, generated on my PC in under 5 minutes.
NSFW warning:
EDIT: I've seen a lot of people saying you need a 4090 or 5090 to run LTX, and that's just not true. You can run it on much weaker hardware; the real question is how much you're willing to compromise on speed, resolution, and workflow setup.
For normal use, 12GB of VRAM is a solid baseline. A 3060 12GB or anything better is enough to get started. People have even managed to run LTX on 8GB cards or less using quantization and other tricks, but that's more of a technical workaround than something I'd recommend if you want a smooth experience.
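To see why quantization lets smaller cards in the door, the back-of-the-envelope math is simple: weight memory scales linearly with bits per weight. This is an illustrative sketch only, using a hypothetical 13B-parameter video model; real usage needs extra VRAM on top of the weights for activations, the VAE, text encoders, and framework overhead.

```python
# Rough VRAM estimate for holding model weights at a given precision.
# Illustrative arithmetic only -- actual requirements are higher because
# inference also needs activations, VAE, text encoders, and overhead.

def weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate gigabytes needed just for the model weights."""
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8)
    return round(bytes_total / 1e9, 1)

# A hypothetical 13B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_vram_gb(13, bits)} GB")
# 16-bit: ~26.0 GB, 8-bit: ~13.0 GB, 4-bit: ~6.5 GB
```

That's the whole trick: a model that can't fit in 12GB at fp16 can land comfortably there at 4-bit, at some cost in quality.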
RAM matters a lot too, and people keep ignoring that part. I'd treat 32GB as the bare minimum; 48GB or 64GB is a much better place to be, especially if you don't want your system constantly leaning on the pagefile and slowing everything down. If your pagefile sits on a slow drive, it's even worse.
ComfyUI has also improved a lot here. It can offload parts of the workflow between VRAM and system memory, which is why cards that look too weak on paper can still run models that technically shouldn't fit, just much slower.
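For anyone who hasn't poked at this: ComfyUI exposes its memory-management behavior as launch flags. The flag names below match recent ComfyUI versions, but check `python main.py --help` on your own install before relying on them.

```shell
cd ComfyUI

# Default: ComfyUI picks a VRAM strategy automatically based on your card.
python main.py

# Low-VRAM cards: aggressively offload model parts to system RAM.
# Slower per step, but lets models run that otherwise wouldn't fit.
python main.py --lowvram
```

This offloading is exactly why the RAM advice above matters: whatever doesn't fit in VRAM has to live somewhere.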
So no, you do not need some insane flagship GPU to use LTX. What stronger hardware really buys you is speed and less pain. For reference, I'm on a 5070 Ti and a 10-second 720p video still takes me around 5 minutes to generate.