For recent performances at music events and on live streams, I've used a video processing setup built in TouchDesigner. It has two main components: a library of found-footage screen recordings that I can load in realtime, and a set of subpatches I switch between, each performing a different type of processing or 3D distortion/displacement. I can also chain the effects and processes through each other in series, or repatch them in realtime. Some performances have been entirely realtime; others were prerecorded, using pause-button editing for more drastic changes. All music in the videos is by me, under my Trash Panda QC alias. Playlists of additional sets are collected on my YouTube channel: https://www.youtube.com/channel/UC8mvECRIgnXyAurYeojbWwg
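The series chaining and realtime repatching described above can be sketched in plain Python. This is a conceptual illustration only, not the actual TouchDesigner patch or its API; the effect functions (`brighten`, `threshold`) and the `EffectChain` class are hypothetical stand-ins for the subpatches.

```python
def brighten(frame, amount=60):
    # Hypothetical effect: raise 8-bit pixel values, clamped at 255.
    return [min(255, px + amount) for px in frame]

def threshold(frame, cutoff=128):
    # Hypothetical effect: hard black/white threshold.
    return [255 if px >= cutoff else 0 for px in frame]

class EffectChain:
    """A chain of effects that can be reordered while running."""

    def __init__(self, effects):
        self.effects = list(effects)

    def repatch(self, order):
        # Reorder the chain by index, as in realtime repatching.
        self.effects = [self.effects[i] for i in order]

    def process(self, frame):
        # Run the frame through each effect in series.
        for fx in self.effects:
            frame = fx(frame)
        return frame

chain = EffectChain([brighten, threshold])
out = chain.process([10, 100, 200])   # brighten, then threshold
chain.repatch([1, 0])                 # swap the order live
out2 = chain.process([10, 100, 200])  # threshold, then brighten
```

Because the effects don't commute, the same input frame yields different results before and after repatching, which is the point of being able to re-route the chain mid-performance.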