I'm trying to build a relighting pipeline entirely inside Nuke on macOS for live-action footage: no 3D passes, just regular filmed material.
The main thing I need to figure out is how to generate a usable normal map from the footage itself, so I can do proper relighting without any render data from a 3D package. I've been looking into ML-based solutions (like DSINE), but I'm curious what people are actually using in production.
A few specific questions:
- What do you use to generate normal maps from live footage in Nuke? (ML nodes, gizmos, external tools that pipe back in?)
- What's your actual relighting setup inside Nuke? Spherical harmonics? Light-direction Grade nodes? Something more sophisticated? (I've put a rough BlinkScript sketch of the simple N·L version after this list.)
- Is there a solid Cattery plugin or gizmo that handles this well on Mac (no CUDA GPU)?
- Do people actually do this fully inside Nuke, or does everyone end up exporting to another app? I'd like to stay in Nuke if possible and avoid a round trip through After Effects or Resolve for this step.
- Any tips on deflickering the normal-map output when working on moving footage? (My rough temporal-blend idea is also sketched below.)
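
On the relighting question, here's roughly what I have in mind so far: a minimal BlinkScript sketch of the basic diffuse N·L approach (single directional light, no spherical harmonics). It assumes the normals arrive RGB-encoded in [0,1] on one input and an RGBA plate on the other; the kernel and parameter names are just my own placeholders, not from any existing gizmo:

```
kernel LambertRelight : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessPoint, eEdgeClamped> normals;  // normal map, RGB encoded [0,1]
  Image<eRead, eAccessPoint, eEdgeClamped> plate;    // footage to relight (assumes RGBA)
  Image<eWrite> dst;

  param:
    float3 lightDir;    // direction toward the light, in the normal map's space
    float3 lightColor;
    float gain;

  local:
    float3 L;

  void define() {
    defineParam(lightDir, "lightDir", float3(0.0f, 0.5f, 1.0f));
    defineParam(lightColor, "lightColor", float3(1.0f, 1.0f, 1.0f));
    defineParam(gain, "gain", 1.0f);
  }

  void init() {
    L = normalize(lightDir);
  }

  void process() {
    // Decode the normal from [0,1] to [-1,1] and renormalize
    float3 n = normalize(float3(normals(0), normals(1), normals(2)) * 2.0f - 1.0f);
    // Lambert term, clamped so back-facing pixels go dark instead of negative
    float ndotl = max(dot(n, L), 0.0f);
    float3 c = float3(plate(0), plate(1), plate(2));
    float3 lit = c * lightColor * ndotl * gain;
    dst() = float4(lit.x, lit.y, lit.z, plate(3));
  }
}
```

In the graph I'd feed the ML normals into one input and the plate into the other, then merge the lit result back over the plate to taste. Is that roughly what people do, or is there a smarter trick?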
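
And on the deflicker side, the only idea I've had so far is a two-tap temporal blend: pipe a TimeOffset(-1) copy of the normals into a second BlinkScript input and blend the vectors before renormalizing. Again just a sketch with made-up names, not a proven setup:

```
kernel NormalTemporalBlend : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessPoint, eEdgeClamped> cur;   // normals at frame t
  Image<eRead, eAccessPoint, eEdgeClamped> prev;  // normals at frame t-1 (via TimeOffset)
  Image<eWrite> dst;

  param:
    float blend;  // 0 = current frame only, 0.5 = equal mix with previous frame

  void define() {
    defineParam(blend, "blend", 0.3f);
  }

  void process() {
    // Decode both maps from [0,1] to [-1,1]
    float3 a = float3(cur(0), cur(1), cur(2)) * 2.0f - 1.0f;
    float3 b = float3(prev(0), prev(1), prev(2)) * 2.0f - 1.0f;
    // Blend the vectors, then renormalize so the result stays a unit normal
    float3 n = normalize(a * (1.0f - blend) + b * blend);
    // Re-encode to [0,1]
    n = (n + 1.0f) * 0.5f;
    dst() = float4(n.x, n.y, n.z, 1.0f);
  }
}
```

A proper temporal median over a few frames would probably hold up better on fast motion, but I'm not sure how people handle the renormalization there.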
For context: I'm on a Mac (Apple Silicon), NukeX 16.0v4, and the footage is standard live-action with no depth or normal passes from a 3D package.
Would love to hear real-world workflows, not just the theory. Thanks!