For a performance at Algorithmic Art Assembly on March 11, 2022, I recorded a video improvisation in TouchDesigner, which was projected while I performed with a custom music patch I built in Plogue Bidule. The video had two main components: a set of found-footage screen captures I could load in real time, and subpatches I could switch between, each performing a different type of processing or 3D distortion/displacement. I recorded the clips through my various effects for about 45 minutes total, with occasional pause-button editing for drastic changes. The effects/processes could also be chained through each other in series, and I occasionally repatched them that way in real time.
For more info on Algorithmic Art Assembly, check out: https://aaassembly.org/