r/TouchDesigner

▲ 902 r/TouchDesigner+1 crossposts

Flux.2-Klein pipeline for real-time webcam stream processing at 30 FPS

I have built a pipeline based on the Flux.2-Klein-4B model that processes a video stream with low latency (about 0.2 seconds) on a single RTX 5090 GPU.
It is free and open source; you can try it locally:
https://github.com/tensorforger/FluxRT

Under the hood, it uses a custom spatial-aware KV-cache, so it only recomputes a small number of image tokens per frame, specifically where something is moving or changing.
It also uses frame interpolation with the RIFE model, which can multiply FPS by a factor of 2, 4, 8, etc. I have found that 4 is the most appropriate for my setup.
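The "recompute only where something changed" part can be sketched with a simple per-tile frame-difference heuristic. This is my own minimal sketch, not the actual FluxRT cache logic (which may use very different heuristics), and `dirty_token_indices` is a name I made up:

```python
import numpy as np

def dirty_token_indices(prev_frame, cur_frame, tile=16, thresh=8.0):
    """Return flat indices of image tokens (tiles) whose content changed
    enough to need recomputation; every other token could reuse the
    cached KV entries from the previous frame.

    Hypothetical sketch of a spatial-aware cache policy, not the real
    FluxRT implementation.
    """
    diff = np.abs(cur_frame.astype(np.float32) - prev_frame.astype(np.float32))
    if diff.ndim == 3:                       # collapse color channels
        diff = diff.mean(axis=2)
    h, w = diff.shape
    gh, gw = h // tile, w // tile
    # mean absolute difference per tile (one tile ~ one image token)
    tiles = diff[:gh * tile, :gw * tile].reshape(gh, tile, gw, tile)
    score = tiles.mean(axis=(1, 3))
    return np.flatnonzero(score > thresh)    # token indices to recompute
```

On a mostly static scene this returns a small index set, which is exactly why the throughput figures below depend on scene dynamics.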

Depending on scene dynamics, the output stream achieves up to 50 FPS in mostly static scenes and around 20 FPS when the entire input image is changing rapidly. Benchmark results are in the repo.

There is also a Gradio demo, several minimal cv2 examples, and a simple paint-style app with real-time canvas updates.

EDIT: Thanks a lot for the support! Added an int8 quantization mode, so it now runs smoothly on an RTX 4090 too, with about 20 GB VRAM at peak.

u/TensorForger — 11 hours ago

is touchdesigner blowing up

Suddenly I'm seeing a million posts on IG with [moody-while-being-cutie gen Zs + gen Alphas] using mediapipe to control DAWs and interact with point clouds and such. And they have huuuuge #s. Is this touchdesigner's moment???

I'm comparing this to the last three years where it was the same dedicated group posting tutorials on youtube. and I just checked YT again and it isn't showing the same activity. so maybe IG is where it's at? I also am not on tiktok so I have no idea what's going on there 👀

P.S. this feels like a synesthetic moment for culture so I'm happy with it, just wondering what the unlock is, bc it's not like TD got easier (I actually immediately searched for MCP and AI in the /r just to see if there was some development there)

I remember half a year ago trying to get AI to help me build something in TD, and it was hallucinating left and right about components and their actual functions 😂😵

u/nuclearbliss — 15 hours ago

Knitted statue of liberty takes ripping off tutorials to a new level

It's great that Chladni cymatics are becoming a thing again. I still get emails about it on a weekly basis, which is great; I'm always willing to help, and curious how it will look in the field.

But this guy, wow, the amount of similarities is uncanny. The funny thing is that I notice certain code, clearly copied directly, that I would do a lot differently now, which tells me someone can't think for themselves.

It would have been fine if a small note had said 'got inspired by this tut (link), just trying to see if we can look at it from a different perspective'. That would have made a night-and-day difference. But no, just claiming to reinvent the wheel.

It's becoming more and more of a thing, very sad...

u/factorysettings_net — 1 day ago

Reaction Diffusion + Slope Distortion

I made this about a year ago, and it has remained my favorite creation yet. So many happy accidents resulting in bonus bubbles and liquid effects.

This was featured as part of Input: Output at Stove Works Gallery in Chattanooga, TN last summer. I've been working in TD for about two years now, and it's been the most fulfilling software to learn. I'm trying to be better about sharing my work, so here we are 🖤

u/thegloriousoob — 2 days ago

I don't know how original this is, but i'm pretty happy with the results :)

Tried to make my own version of Unknown Pleasures on TD

It was fun

u/allpunks — 1 day ago
▲ 409 r/TouchDesigner+11 crossposts

PaperStrip_FX is a TouchDesigner COMP that turns frame history into strip-based slices: it feels like a scan/photocopy pass where time gets cut into paper bands, then reassembled with motion-reactive stepping, drift, and print artifacts. Inspired by Oi Va Voi's Everytime music video: https://www.youtube.com/watch?v=KQhbuBBqvRY
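For anyone curious how a strip-based time slice works in principle, here is how I would read that description in plain numpy. This is a hypothetical sketch, not the actual PaperStrip_FX code; the class and parameter names are mine, and the real COMP layers motion-reactive stepping, drift, and print artifacts on top:

```python
import numpy as np
from collections import deque

class PaperStrip:
    """Sketch of a strip-based time-slice effect: each horizontal band
    of the output is sampled from a different frame in the recent
    history, so motion gets cut into paper-like bands."""

    def __init__(self, history=30, strip_height=8):
        self.frames = deque(maxlen=history)  # ring buffer of past frames
        self.strip_height = strip_height

    def process(self, frame):
        self.frames.append(frame.copy())
        out = frame.copy()
        n = len(self.frames)
        h = frame.shape[0]
        for band, y in enumerate(range(0, h, self.strip_height)):
            # lower bands look further back in time (stepped delay)
            delay = (band * 2) % n
            src = self.frames[n - 1 - delay]
            out[y:y + self.strip_height] = src[y:y + self.strip_height]
        return out
```

Feeding this a webcam stream gives the basic "time cut into bands" look; everything expressive about the effect comes from how the per-band delay is modulated.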

The video, on the other hand, is done through Uisato Studio's Music Video mode, our flagship AI orchestration workflow, available now [exclusively for the TouchDesigner community].

Project files available through the Tools Store: https://uisato.studio/tools

u/TasTepeler — 7 days ago

Using real-world datasets instead of noise in TouchDesigner / POPs? Simulating live data streams for generative systems

has anyone here replaced noise with real datasets in touchdesigner?

i’m building a generative system using json news/social data (sentiment, velocity, timestamps, topics, etc) and trying to use the actual data as the force/behavior system instead of procedural noise.

thinking about creating a pseudo realtime stream from ~2k rows of data that continuously updates over time.

using pops/topology/proximity/feedback systems rn. curious if anyone’s done something similar or has ideas for best practices/workflows/tutorials/artists to study
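For the pseudo-realtime part, one minimal approach is a generator that walks the rows at a fixed rate and lerps between neighbors, so whatever cooks per frame (e.g. a Script CHOP callback) sees a continuous signal instead of stepped values. A sketch under my own assumptions; the `sentiment` field name is just a placeholder for whichever numeric column you drive with:

```python
def pseudo_stream(rows, rate=60.0, rows_per_sec=2.0):
    """Turn a static dataset (~2k dict rows) into a pseudo-realtime
    signal: advance through rows at rows_per_sec, looping at the end,
    and linearly interpolate a numeric field between adjacent rows.
    'sentiment' is a placeholder field name."""
    t = 0.0
    step = rows_per_sec / rate          # row progress per frame
    while True:
        i = int(t) % len(rows)
        j = (i + 1) % len(rows)
        frac = t - int(t)
        a = rows[i]["sentiment"]
        b = rows[j]["sentiment"]
        yield a + (b - a) * frac        # lerp between adjacent rows
        t += step
```

Calling `next()` on this once per cook gives a smooth scalar you can map onto force strength, proximity radius, or any POP parameter.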

u/HappyCod4402 — 4 hours ago
▲ 132 r/TouchDesigner+1 crossposts

Is it time for TouchDesigner ?

Hey everybody, I've been VJing for 2 years now, and I'm starting to wonder if I should switch to TouchDesigner.

I've been experimenting with Resolume and analog gear only, and I think I've reached a bit of a limit with what I can do. However, TouchDesigner is SCARY as hell, and it would feel like going backward in terms of progress, at least for a while.

Is it a must? It feels like there is nothing better when it comes to creating visuals from scratch... I've tried Resolume Wire for a bit, but honestly, it was a lot of money spent for not an amazing result; the software is really dry to handle!

What should my first approach be? Any tips for starting to incorporate it into my compositions? The possibilities seem so big that it's scary.

Track : Josef Tumari & Oxidecin - Ikonar

u/Myreil — 6 days ago

Reactive Cymatics

Had a lot of fun with this! I'll leave the tutorial link for the Chladni cymatic visual in the comments, as well as the song link.

u/Burrows_Production — 6 days ago

TouchDesigner newbie dreaming of live visuals

Hi friends!

I have been learning TouchDesigner for a few weeks in my free time, and I am really excited about the process so far. I have found a lot of great tutorials on YouTube and some inspiring books that are helping me understand what I want to create.

I am mostly inspired by nightlife, concert visuals, live visuals, and generative art. My goal is to do a small live set in my living room within the next 3–6 months, just as a personal experiment. My background is in visual design, and I have always felt drawn to creative coding and digital artists.

Right now, I can follow tutorials and make things, but I do not fully “think in TouchDesigner” yet: I do not naturally come up with ideas and then know how to solve them inside the software. How long did it take you to get to that point?

Some other questions I have:

  • At what point do you think most people quit or get discouraged?
  • Is it ever “too late” to get good at this?
  • I haven't seen any of this yet, but how do you think AI could impact live and interactive experiences?

From my side, I am not very strong in 3D, so I actually feel excited about AI making 3D more accessible. It makes me feel like I can start thinking about bigger projects.

I am really excited to start a conversation, share references, learn from others, and hopefully make some nerdy creative friends here :)
