r/NukeVFX

Logo replacement on a heavily waving flag: what's your current workflow?

Hey everyone, I've got a shot where a flag is waving in strong wind: lots of chaos, heavy rotation, and the flag frequently folds over itself so the backside (which also has a logo) briefly comes into view. The client wants the logo replaced.

I'm primarily a Houdini artist these days and don't do compositing full-time, so I'm not always 100% up to date on the latest workflows for this kind of thing. Would love to hear what you'd reach for in 2026.

For reference, the shot is pretty similar to this: https://www.youtube.com/watch?v=gskOfgYBUjg except my flag isn't semi-transparent; it's a solid black flag with a white logo.

For similar problems in the past I've used EbSynth on a T-shirt logo replacement where the logo was frequently hidden by folds and wrinkles; it worked reasonably well but was pretty tedious to manage. I've also had good results with ComfyUI inpainting when I needed to remove an object and had no information about what was underneath.

The main challenges here are the heavy chaotic motion with lots of self-occlusion, and the fact that the backside of the flag is also visible at times, so essentially two surfaces to deal with. At least the flag is fully opaque so no transparency issues.
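Worth noting alongside the AI options: the classic non-AI route for exactly this kind of shot is NukeX's SmartVector / VectorDistort (or a UV pass rendered from a matchmoved flag), which warps the new logo through the flag's motion. The warp step itself is just a coordinate lookup, sketched here in numpy (the array shapes and nearest-neighbour lookup are my assumptions; in Nuke this is the STMap node):

```python
import numpy as np

def stmap_sample(logo: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Sample a (H, W) logo through a (h, w, 2) UV map with coordinates
    in [0, 1], nearest-neighbour -- what Nuke's STMap node does."""
    H, W = logo.shape
    xs = np.clip((uv[..., 0] * (W - 1)).round().astype(int), 0, W - 1)
    ys = np.clip((uv[..., 1] * (H - 1)).round().astype(int), 0, H - 1)
    return logo[ys, xs]

# Sanity check: an identity UV map reproduces the logo unchanged.
logo = np.arange(16.0).reshape(4, 4)
u, v = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
assert np.allclose(stmap_sample(logo, np.stack([u, v], axis=-1)), logo)
```

The hard part is producing a stable UV map for a chaotically folding flag; SmartVector handles that per frame, and the fold-overs and visible backside would need separate mattes so the front and back logos don't bleed into each other.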

Are there AI-assisted approaches that actually hold up in this kind of chaos? Curious whether anyone has had to manage a shot like this before, and whether EbSynth- or ComfyUI-based approaches are actually viable here.

Thanks in advance!

u/hbskr — 10 hours ago

Rec709 to Rec709 ACES workflow

Hi, how are you guys getting Rec709 plates through ACES (ACEScg as the working space for CG element compositing) without any gamma shift on export?

In the past I swear I have always just set the input transform to Output - Rec709 and then again on the write node set the output transform to Output - Rec709, and the render has perfectly matched the plate.

Now, however, in Nuke 16 / ACES 1.2, doing the same results in the export looking washed out compared to the original plate. Testing 1.3 as well, I can't seem to get the output to match the plate perfectly.

How do you guys deal with Rec709 plates correctly?
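I can't say which transform a given Nuke 16 config maps "Output - Rec709" to, but a washed-out round trip almost always means the decode and encode aren't the same flavour of "Rec709": camera OETF vs display gamma 2.4, or a tone-mapped Output transform vs the plain curve. The mechanism, sketched with the BT.709 OETF from the spec (the function names are mine):

```python
def rec709_oetf(L: float) -> float:
    """BT.709 camera OETF: scene-linear -> Rec.709 code value."""
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

def rec709_oetf_inverse(V: float) -> float:
    """Inverse OETF: Rec.709 code value -> scene-linear."""
    return V / 4.5 if V < 0.081 else ((V + 0.099) / 1.099) ** (1 / 0.45)

# A matched in/out pair round-trips losslessly:
for x in [0.0, 0.01, 0.18, 0.5, 1.0]:
    assert abs(rec709_oetf_inverse(rec709_oetf(x)) - x) < 1e-9

# Decoding with the camera OETF but re-encoding with display gamma 2.4
# (or vice versa) lifts the mids -- the classic washed-out look.
mismatched = rec709_oetf_inverse(0.5) ** (1 / 2.4)
assert mismatched > 0.5
```

So the fix is usually to check that the Read and Write colorspaces name the exact same transform in the active OCIO config, not just two transforms that both say "Rec709".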

u/cinematic_flight — 2 days ago

Transfer black and white EXR into alpha.

Hi all,
Using After Effects, I made a mask and exported it as an EXR.

How can I turn the black-and-white values from that EXR into an alpha mask in Nuke?

Any assistance would be greatly appreciated.
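For a single-channel black-and-white matte, the usual answer is a Shuffle (or Copy) node routing one of the RGB channels into the alpha, since all three carry the same matte. A minimal pasteable sketch in old-style Shuffle syntax (newer Nuke versions serialize the default Shuffle node differently, but the mapping is the same):

```
Shuffle {
 red red
 green green
 blue blue
 alpha red
 name MatteToAlpha
}
```

A Copy node from rgba.red to rgba.alpha does the same job; just make sure the Read node's colorspace is set to linear/raw so the matte values aren't gamma-shifted on the way in.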

u/Comprehensive_Fun_76 — 4 days ago

Best method for 3D camera tracking a pan shot (taken on a Samsung A52 smartphone with a wide-angle camera)

Hi, I'm new to the 3D camera tracking world. I've spent days trying to understand technical camera concepts like focal length, aperture, lens size, optical center, etc. I have a pan shot that I filmed with locked focus and locked exposure in 4K 30 fps. I've tried After Effects camera tracking, Blender camera tracking, and a COLMAP photogrammetry point cloud plus the animated camera path. Each of them has problems.

At first, the AE tracker does track the footage, but with no depth, so the solve error stays up to 4 px. It looks pretty OK, but the camera in AE still jitters, lags, and slides.

Blender's tracker was unusable for me.

Then I tried the COLMAP photogrammetry solution. The problem is that instead of rotating the camera around its pivot point, it animated the camera along a huge path, which has many orientation problems like a coordinate-scaling issue: when I place a 3D object into the scene, the object slides and pans more than the real-life camera did. And because of the lack of depth, COLMAP actually created a panoramic 2.5D point cloud instead of properly reconstructing a whole photogrammetry 3D scene, so the alignments in COLMAP are essentially imperfect.

So I need a better workflow. I even tried scaling down the point cloud to match the coordinate system perfectly, using real-life measurements like the distance from the camera to the green belt, the road width, and the distance from the ground to the camera. But the 3D objects still seem to slide and pan more than the camera, even though the points from the point cloud stick exactly to their respective positions throughout the entire scene. Can you please suggest a better workflow? Should I use Nuke's 3D camera tracker? I basically need to attach a 3D car model (parked) on the green belt.
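The likely root cause here: a pan from a fixed position is a (near-)nodal rotation, which produces no parallax, so full 3D solvers like AE, Blender, and COLMAP have a degenerate problem and either jitter or invent a fake path. The usual fix is a rotation-only solve (if I remember right, Nuke's CameraTracker has a rotation-only camera motion setting) and then placing the car at a manually measured depth, since a pure pan cannot recover depth. A small numpy sketch of why the solvers fail (pinhole camera with focal length 1, my own toy setup):

```python
import numpy as np

def rot_y(deg):
    """Rotation matrix for a pan of `deg` degrees around the Y axis."""
    a = np.radians(deg)
    return np.array([[np.cos(a), 0, np.sin(a)],
                     [0, 1, 0],
                     [-np.sin(a), 0, np.cos(a)]])

def project(p, R):
    """Pinhole projection of world point p through a camera sitting at
    the origin with orientation R (focal length 1, no distortion)."""
    q = R.T @ p                 # world -> camera coordinates
    return q[:2] / q[2]

# Two points on the same ray from the camera, at very different depths.
near = np.array([0.2, 0.1, 2.0])
far = np.array([1.0, 0.5, 10.0])

for pan in [0.0, 5.0, 15.0]:
    R = rot_y(pan)
    # Under pure rotation they stay on top of each other: zero parallax,
    # so a full 3D solve has no depth information to work with.
    assert np.allclose(project(near, R), project(far, R))
```

This is also why scaling the COLMAP point cloud can't fix the sliding: the cloud's depths were never real in the first place.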

u/Wise_Huckleberry_902 — 2 days ago

Nuke Studio - lining up VFX shots across multiple timelines

In Nuke Studio, how can I more easily match up a comp layer I've already created for a clip in one timeline to a different timeline that uses the same source clip but with different in and out points?

So let's say I get 3 edits -

1x60s

1x30s

1x20s

I've conformed the 60s edit and created Nuke scripts for all the VFX shots. Now I want to move on to the 30s and 20s timelines, which reuse most of the VFX shots but with different in/out points and frame ranges. I don't want to make duplicate Nuke comps for these shots, of course, so I have to get the VFX shot layers over from the 60s timeline and make sure they match timing-wise.

I could of course brute-force this and manually copy/paste every comp layer from the 60s edit, then match them up and extend for missing frames at either the heads or tails. But does anyone have python code or a tool that makes this process easier? Something like right-clicking a source clip in one of the cutdowns, retrieving the VFX shot with the same name from a different timeline in the project, and aligning it timing-wise to quickly see whether frames are missing.

I'm not very Python-savvy, which is most definitely the reason I haven't figured this out on my own. I work solo at home, and this would make life so much easier for projects with tons of timeline edits.
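The Hiero/Nuke Studio API side of this (iterating a sequence's video tracks and track items via hiero.core) is worth checking against the Hiero developer docs, but the core step is plain bookkeeping: index the conformed timeline's items by shot name, then for each cutdown item compare source ranges to spot missing head/tail frames. A sketch with plain dicts standing in for TrackItems (all names and keys here are mine, not Hiero API):

```python
# Each dict stands in for a Hiero TrackItem: shot name plus source range.
master = [
    {"name": "SHOT_010", "src_in": 1001, "src_out": 1060},
    {"name": "SHOT_020", "src_in": 1001, "src_out": 1100},
]
cutdown = [
    {"name": "SHOT_010", "src_in": 995, "src_out": 1050},  # needs head frames
    {"name": "SHOT_030", "src_in": 1001, "src_out": 1020},  # new shot
]

def compare_edits(master_items, cut_items):
    """For each cutdown item, look up the master item by name and report
    (missing head frames, missing tail frames), or flag it as missing."""
    by_name = {item["name"]: item for item in master_items}
    report = {}
    for item in cut_items:
        ref = by_name.get(item["name"])
        if ref is None:
            report[item["name"]] = "no master comp"
            continue
        missing_head = max(0, ref["src_in"] - item["src_in"])
        missing_tail = max(0, item["src_out"] - ref["src_out"])
        report[item["name"]] = (missing_head, missing_tail)
    return report

# SHOT_010 needs 6 extra head frames; SHOT_030 has no master comp yet.
report = compare_edits(master, cutdown)
```

In Nuke Studio you'd populate these dicts from the active sequence's track items and print (or tag) the report, which gives you exactly the "which frames are missing" check without copy/pasting anything.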

u/freckt — 1 day ago

Shadow layer making scene lighter/blurry and 2D asset disappears

Hello everyone

I am a first-year student studying animation and VFX, and this is the first composition we have been assigned, following along with tutorials made by my lecturer. So I am a complete beginner, but I'm slowly understanding how Nuke works.

However, I am currently struggling to get my shadow layer and the Cloud.png to show at the same time. The shadow layer is also causing the scene to become much brighter and blurrier, as you can see going from the cloud merge node to the shadow merge node. I have been trying to use Gemini for help, but it's all been a bit confusing for me. Have I got the nodes connected wrong, have I got a setting that I shouldn't be using, or is there another node I can plug in to fix this? Any help would be greatly appreciated; if you need any more information, please just ask.
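A scene getting brighter when a shadow is merged is the classic symptom of the shadow pass going through an `over` merge (often with an empty alpha) instead of a `multiply`. The two operations, sketched with single pixel values (the merge math for `over` is standard; the example numbers are mine):

```python
def merge_over(A, a, B):
    """Nuke's `over`: A + B * (1 - a), with A the premultiplied
    foreground RGB, a its alpha, B the background RGB."""
    return A + B * (1 - a)

def merge_multiply(A, B):
    """Nuke's `multiply`: darkens B wherever the shadow pass A < 1."""
    return A * B

bg = 0.8                             # background pixel value
shadow_rgb, shadow_alpha = 0.5, 0.0  # grey shadow pass with empty alpha

# With no alpha, `over` ADDS the shadow RGB on top: scene gets brighter.
assert abs(merge_over(shadow_rgb, shadow_alpha, bg) - 1.3) < 1e-12
# `multiply` darkens, which is what a shadow should do.
assert merge_multiply(shadow_rgb, bg) == 0.4
```

So check the operation dropdown on the shadow Merge node, and check that the Cloud.png merge sits after (not inside) the shadow branch so neither merge overwrites the other.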

Thank you for your time.

u/Cammie_98 — 1 day ago

Hello,

I have a shot with a three-dimensional object and geo for it, which I use to hold out the layers of dust and debris. The problem is that the geo isn't perfectly accurate to what's in the plate, but I have 2D roto that's perfect.

The problem: using a deep holdout, how can I use the 2D roto to reproject and properly hold out the different depths? I know I can do this for something like a 2D car if it's a simple holdout, but how do I do it when the holdout has three dimensions?

The issue: I don't know how to project that 2D roto onto 3D geometry, and even if I did, the 3D geometry isn't accurate, like I mentioned, so it would cut off my roto in areas. It's like the 3D geometry would need to be eroded in 3D space to account for that, lol, but I don't know if that kind of thing is even possible.
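For what it's worth, the deep holdout operation itself is simple per-sample math, which is why a fixed-up matte can be rebuilt manually (e.g. by grading the holdout's deep alpha with the 2D roto before the holdout): each sample is attenuated by the transmittance, the product of (1 - alpha), of the holdout samples in front of it. A sketch with samples as (depth, alpha) tuples (this list-of-tuples representation is mine, not a Nuke API):

```python
def deep_holdout(samples, holdout):
    """Attenuate each (depth, alpha) sample by the transmittance of all
    holdout samples lying in front of it (smaller depth = closer)."""
    out = []
    for z, a in samples:
        transmittance = 1.0
        for hz, ha in holdout:
            if hz < z:                  # holdout sample is in front
                transmittance *= (1.0 - ha)
        out.append((z, a * transmittance))
    return out

dust = [(5.0, 0.5), (20.0, 0.5)]    # two dust samples
geo = [(10.0, 1.0)]                 # opaque holdout geo at depth 10

# The near sample survives, the far one is fully held out.
assert deep_holdout(dust, geo) == [(5.0, 0.5), (20.0, 0.0)]
```

The practical takeaway: if you multiply the inaccurate geo's holdout alpha down to zero wherever your 2D roto says "not object" (and up wherever it says "object"), the depth ordering still comes from the geo, but the silhouette comes from the roto.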

Thanks for any help!

u/sevenumb — 6 days ago

I’m currently testing the Samsung Odyssey 3D monitor for stereo work in Nuke, but I’m facing a few issues:

  • Stereo depth breaks while zooming in/out in Viewer
  • Alignment feels inconsistent during compositing
  • Hard to judge accurate depth for roto and paint

For gaming and media it looks impressive, but for professional stereo compositing I’m not sure if it’s reliable enough.

Has anyone successfully used a glasses-free monitor in a real stereo pipeline, with or without plugins?
Or is passive 3D still the better option for Nuke work?

Would love to hear your experience and monitor recommendations.

u/DesiDiscovery — 8 days ago

Animation Pipeline / Workflow Research — Looking for Industry Experiences & Insights

Hi!
I’m an animation student at the Filmuniversität Babelsberg in Germany, currently developing an early-stage research/project idea around animation production pipelines and workflow infrastructure.

I’m especially interested in the less visible parts of production — scene prep, handoff friction between departments, pipeline organization, communication breakdowns, technical setup, production tooling, etc. Basically the invisible machinery that shapes your work behind the scenes.

I put together a survey aimed at animators, rigging artists, compositors, TDs, production staff, motion designers, pipeline people and anyone else involved in animation workflows.

https://forms.gle/jqTSDCL9mDXJAuGn6

Would genuinely love to hear about real-world experiences, frustrations, workarounds, recurring production issues, or whatever else comes to mind from people actually working in the field.

I hope this is the right place to share this :)

Thanks!

u/raccooonboy — 5 days ago

I'm trying to set up a relighting pipeline entirely inside Nuke on Mac for live-action footage — no 3D passes, just regular filmed material.

The main thing I'm trying to figure out is how to generate a usable normal map from footage so I can do proper relighting without any 3D render data. I've been looking into ML-based solutions (like DSINE) but I'm curious what people are actually using in production.

A few specific questions:

  1. What do you use to generate normal maps from live footage in Nuke? (ML nodes, gizmos, external tools that pipe back in?)
  2. What's your actual relighting setup inside Nuke? Spherical harmonics? Light direction Grade nodes? Something more sophisticated?
  3. Is there a solid Cattery plugin or gizmo that handles this well on Mac (no CUDA GPU)?
  4. Do people bother doing this fully inside Nuke or do you always export to another app? I want to stay in Nuke if possible and avoid After Effects or Resolve for this step.
  5. Any tips on deflickering the normal map output when working on moving footage?

For context: working on a Mac (Apple Silicon), NukeX 16.0v4, footage is standard live-action with no depth or normal passes from a 3D package.

Would love to hear real-world workflows, not just the theory. Thanks!
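On question 2: once a normals pass exists (from DSINE or similar), the simplest relight is Lambertian, i.e. gain the plate by max(0, N·L) per pixel, which in Nuke is a handful of Expression/Multiply nodes. The math, sketched in numpy (the function name, arguments, and light direction are mine):

```python
import numpy as np

def relight_diffuse(plate, normals, light_dir, intensity=1.0):
    """Lambert relight: per-pixel gain = max(0, N . L). `normals` is
    (H, W, 3) in [-1, 1] (decode a [0, 1] normal map with n*2-1 first);
    `plate` is (H, W, 3) RGB."""
    L = np.asarray(light_dir, dtype=float)
    L = L / np.linalg.norm(L)
    ndotl = np.clip(np.einsum("hwc,c->hw", normals, L), 0.0, None)
    return plate * (ndotl * intensity)[..., None]

# A pixel facing the light keeps full value; one facing away goes black.
plate = np.ones((1, 2, 3))
normals = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]]])
out = relight_diffuse(plate, normals, light_dir=(0, 0, 1))
assert np.allclose(out[0, 0], 1.0) and np.allclose(out[0, 1], 0.0)
```

For question 5, one cheap deflicker option is a temporal median or blend over a couple of neighbouring frames on the normals pass before the dot product, since Lambert shading amplifies any per-frame normal chatter.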

u/copticopay — 7 days ago

I've used these types of tools before at a previous job, but I can't remember exactly what they were called: some smart gizmos that would take a rough alpha input, such as a slightly noisy key, and give you back a filtered, sharp, smooth edge.

Any suggestions? I've been searching through Google and Nukepedia but can't seem to find the right tool.
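That sounds like the family of matte edge-refine gizmos floating around Nukepedia. The core trick is small enough to rebuild if the exact gizmo never turns up: blur the noisy matte, then push it back through a steep remap so the edge ends up smooth but hard. A 1D sketch (the parameter values are arbitrary):

```python
import numpy as np

def refine_alpha(alpha, blur_size=5, lo=0.25, hi=0.5):
    """Noisy matte -> smooth hard edge: box-blur, then remap so values
    below `lo` clamp to 0 and above `hi` clamp to 1 (a steep S-curve)."""
    kernel = np.ones(blur_size) / blur_size
    blurred = np.convolve(alpha, kernel, mode="same")
    return np.clip((blurred - lo) / (hi - lo), 0.0, 1.0)

# A chattery key edge comes out as a clean 0 -> 1 transition.
noisy = np.array([0.0, 0.1, 0.0, 0.2, 0.9, 1.0, 0.8, 1.0, 1.0, 1.0])
clean = refine_alpha(noisy)
assert clean[0] == 0.0 and clean[-1] == 1.0
```

In Nuke terms that's roughly Blur into a steep Grade/ColorLookup on the alpha, often with a FilterErode first to choke the matte before blurring.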

u/sdfgsdgsdfgsdgsdg — 11 days ago