u/TasTepeler

▲ 189 r/artandcode+8 crossposts

[Release] LongExposureFX COMP | An experimental temporal ghosting toolkit

An experimental temporal ghosting / long-exposure toolkit for TouchDesigner, built for turning prerecorded and real-time footage into smeared, split-exposure, echo-like motion.

The system layers delayed frames, masks the active subject region, and adds optional feedback persistence to generate distorted portrait, face, and full-body trails that sit somewhere between long exposure, temporal rupture, and spectral motion blur.
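
The layering described above can be sketched outside TouchDesigner in a few lines. This is a minimal illustrative model, not the actual COMP: the class name, parameters, and weighting scheme are all assumptions, and frames are simplified to flat lists of 0-1 grayscale values.

```python
from collections import deque

class EchoTrail:
    """Toy long-exposure / ghosting pass: blend the current frame with
    delayed copies (optionally confined by a subject mask) plus a
    decaying feedback buffer. Hypothetical sketch, not the released tool."""

    def __init__(self, n_echoes=4, delay=3, persistence=0.85):
        self.history = deque(maxlen=n_echoes * delay + 1)
        self.n_echoes = n_echoes
        self.delay = delay
        self.persistence = persistence   # how strongly past output lingers
        self.feedback = None

    def process(self, frame, mask=None):
        self.history.append(frame)
        out = list(frame)
        # layer delayed frames, each echo fainter than the last
        for i in range(1, self.n_echoes + 1):
            idx = len(self.history) - 1 - i * self.delay
            if idx < 0:
                break
            weight = 0.5 ** i
            echo = self.history[idx]
            for p in range(len(out)):
                m = mask[p] if mask else 1.0   # mask confines trails to subject
                out[p] = min(1.0, out[p] + weight * m * echo[p])
        # feedback persistence: previous output bleeds into this one
        if self.feedback is not None:
            out = [min(1.0, o + self.persistence * f * 0.3)
                   for o, f in zip(out, self.feedback)]
        self.feedback = out
        return out
```

In TouchDesigner terms, the history buffer would be a Cache TOP and the feedback loop a Feedback TOP; the sketch just makes the blend order explicit.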

This release also includes:

- a custom FLUX-2 LoRA trained on experimental photography [the one used in this demonstration]
- the matching ComfyUI workflow for FLUX-2.dev + LoRA text-to-image generation

Available now through my Tools Store.

Both music and visuals by myself, deeply inspired by the recent BoC-related events.

u/TasTepeler — 6 hours ago
▲ 87 r/HybridProduction+10 crossposts

I created an agentic orchestration pipeline for music video generation - [More info in comments]

I’ve been building Uisato Studio, a workflow-based AI creation platform for audiovisual work.

This is the Music Video mode: upload an image + audio, and the system analyzes the input, generates visual direction, creates clips, handles b-roll / lip-sync when needed, and assembles everything into a finished music video through a guided pipeline.
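
A guided pipeline like that can be sketched as a list of stages that each read and extend a shared state. This is a hypothetical illustration of the orchestration pattern, not Uisato Studio's actual code; all stage names and values are stand-ins.

```python
# Each stage is a plain function over a shared state dict (assumed structure).
def analyze(state):
    state["bpm"] = 120                         # stand-in for audio analysis
    return state

def direct(state):
    state["direction"] = f"cut every beat at {state['bpm']} BPM"
    return state

def generate_clips(state):
    state["clips"] = [f"clip_{i}" for i in range(4)]
    return state

def assemble(state):
    state["video"] = " + ".join(state["clips"])
    return state

PIPELINE = [analyze, direct, generate_clips, assemble]

def run(image, audio):
    state = {"image": image, "audio": audio}
    for stage in PIPELINE:                     # stages run in fixed order
        state = stage(state)
    return state
```

The point of the pattern is that every stage sees the accumulated context, so later steps (b-roll, lip-sync, assembly) can condition on earlier analysis.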

I’m trying to move AI video from isolated generation into orchestration: an agentic production system built for more coherent, edit-ready audiovisual output.

I’ve been building this suite for the past year, hope you guys enjoy it: https://uisato.studio/

u/TasTepeler — 1 day ago
▲ 409 r/artandcode+11 crossposts

PaperStrip_FX is a TouchDesigner COMP that turns frame history into strip-based slices: it feels like a scan/photocopy pass where time gets cut into paper bands, then reassembled with motion-reactive stepping, drift, and print artifacts. Inspired by Oi Va Voi’s Everytime music video: https://www.youtube.com/watch?v=KQhbuBBqvRY
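
The core slicing idea can be shown in plain Python. This is a hedged sketch of the technique, not the COMP itself: each horizontal band of the output is pulled from a progressively older frame, so time reads as stacked paper strips. Frames are simplified to lists of rows, and the parameter names are assumptions.

```python
def paper_strips(history, band_height=2, step=1, drift=0):
    """Toy strip-slicing pass: `history` is a list of frames, newest
    last; each frame is a list of rows. Bands lower in the image come
    from older frames; `drift` shifts rows per band for a print-artifact
    feel. Illustrative only."""
    newest = history[-1]
    out = []
    for y in range(len(newest)):
        band = y // band_height
        age = min(band * step, len(history) - 1)   # older frame per band
        src = history[-1 - age]
        out.append(src[(y + drift * band) % len(src)])
    return out
```

In the COMP this would run on the GPU over a cached frame history; the sketch just makes the band-to-frame mapping explicit.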

The video itself was made with Uisato Studio’s Music Video mode, our flagship AI orchestration workflow, available now [exclusively for the TouchDesigner community].

Project files available through the Tools Store: https://uisato.studio/tools

u/TasTepeler — 6 days ago
▲ 204 r/artandcode+6 crossposts

A∴V∴P / SYSTEM_Δ

This is a TouchDesigner video player whose playback position responds to incoming audio in real-time.
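
One common way to drive playback from audio is an envelope follower mapped to a frame index. This is a hedged sketch of that general technique, with assumed names and a simple one-pole smoother; it is not the released player's actual logic.

```python
def scrub_index(levels, n_frames, smooth=0.8):
    """Map a stream of audio levels (0-1) to playback frame indices:
    a one-pole envelope follower smooths the level, and the smoothed
    value picks the frame. Illustrative sketch only."""
    env = 0.0
    indices = []
    for lv in levels:
        env = smooth * env + (1.0 - smooth) * lv   # one-pole smoothing
        indices.append(min(n_frames - 1, int(env * n_frames)))
    return indices
```

Higher `smooth` values give slower, smearier scrubbing; lower values make the playhead twitch with every transient.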

Available in four different formats, through the recently released tools page.

More experiments through my YouTube, or Instagram.

u/TasTepeler — 8 days ago

▲ 136 r/artandcode+5 crossposts

Audio-reactive geometry TouchDesigner + AE patch I made some time ago. Hope you guys enjoy it!

If you're curious about my experiments, you can watch more [and even access their project files] through my YouTube, Instagram, or Tools Store.

u/TasTepeler — 10 days ago
▲ 148 r/artandcode+6 crossposts

Open-sourcing kinect-controlled instrument!

If you have a Kinect camera lying around, I've got good news for you:

This system [Kinect → TouchDesigner → Ableton Live] turns gestures into MIDI in real-time, so Ableton can treat your hands (or any part of your body) like a MIDI controller: trigger notes, move filters, or drive any VST parameter you'd like. [Just updated it for a bit more clarity and an easier setup!]
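
The gesture-to-MIDI mapping at the heart of a setup like this is small. Here is a hedged sketch of that mapping step only (the controller numbers and function name are arbitrary examples, not the released patch): normalized 0-1 hand coordinates become two 7-bit Control Change values, as a MIDI Out CHOP would then send them to Ableton.

```python
def hand_to_cc(x, y, cc_x=74, cc_y=1):
    """Map normalized hand coordinates to (controller, value) pairs.
    Coordinates outside 0-1 are clamped; values are scaled to the
    7-bit 0-127 MIDI CC range. Illustrative sketch only."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return [(cc_x, int(clamp(x) * 127)), (cc_y, int(clamp(y) * 127))]
```

In Ableton, each CC is then MIDI-mapped to whatever parameter you want the hand axis to drive.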

As of today, you can freely access it through the Tools Store, along with a full tutorial on my YouTube channel.

I've run several tests over the years, controlling drums, guitar FX chains, and even a complex Harmony Engine stack; you can see/hear a bunch of these in this post.

Hope you folks enjoy it deeply.

u/TasTepeler — 11 days ago