Interactive granular texture synthesiser
An interactive granular texture synthesiser built in TouchDesigner, using real-time signal processing in the domain-specific language Faust (Functional Audio Stream) through the corresponding TD_Faust operator. Apple's ARKit sends FaceID data as OSC messages via the UDP protocol into TD, while MediaPipe handles the hand tracking. The hand tracking controls the Faust effect parameters much like an XY pad, with an additional Z axis that tracks how close your hand is, visualised by a circle shrinking or growing. The Z axis is mapped to the filter cutoff of the reverb, which is also built into the FM synth and can be downloaded as a VST3 or as a component plugin (macOS only).
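As an illustration of the transport layer, the sketch below shows how incoming OSC-over-UDP messages could be received in Python with the python-osc library; inside TouchDesigner itself an OSC In CHOP does this job. The address pattern /faceid/* and port 9000 are assumptions, not the project's actual settings.

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_faceid(address, value):
        # Each OSC message carries one blendshape value, e.g. mouth-open 0..1.
        print(address, value)

    dispatcher = Dispatcher()
    dispatcher.map("/faceid/*", on_faceid)  # assumed address pattern

    # Listen on UDP port 9000 (assumed) for incoming OSC packets.
    BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()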
Project Demonstration
Using MediaPipe for the hand tracking and the ZigSim app to send OSC data via the UDP protocol into TD.
Interview
An interview with Stanford graduate David Braun, who wrote the TD_Faust operator used in the project.
How it works
The looper at the heart of the patch is written in Faust, recording into and reading back from a ring buffer (rwtable):

    import("stdfaust.lib");

    looper = rwtable(tablesize, 0.0, recindex, _, readindex)
    with {
        record = button("[2]Record") : int;
        readspeed = hslider("[0]Read Speed", 1, 0.001, 10, 0.01);
        tablesize = 48000; // one second at 48 kHz
        // Write index: counts up and wraps, advancing only while Record is held.
        recindex = (+(1) : %(tablesize)) ~ *(record);
        // Read index: a phasor running at readspeed, scaled to the table length.
        readindex = readspeed/float(ma.SR) : (+ : ma.decimal) ~ _ : *(float(tablesize)) : int;
    };

    process = looper;
Parameter mapping
Mouth-open was mapped to record, while the X and Y axes of the hand tracking controlled reverb parameters. Blinking and smiling randomised which parameters X and Y affected, and the Z axis controlled the reverb damping (a simple lowpass filter cutoff).
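Since filter cutoffs are perceived roughly logarithmically, the Z-to-cutoff mapping works best on an exponential curve. Below is a minimal Python sketch, assuming the hand distance arrives as a normalised 0..1 value; the function name and frequency range are hypothetical, not taken from the project.

    import math

    def z_to_cutoff(z, lo=80.0, hi=18000.0):
        # Clamp the normalised hand distance, then map it exponentially
        # so equal hand movements give equal pitch-like steps in cutoff.
        z = min(max(z, 0.0), 1.0)
        return lo * math.exp(z * math.log(hi / lo))

    # e.g. z_to_cutoff(0.0) -> 80 Hz, z_to_cutoff(1.0) -> 18000 Hz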
    # CHOP Execute DAT callback: forward the incoming FaceID channel value
    # to the Record widget inside the UI container.
    def onValueChange(channel, sampleIndex, val, prev):
        widget = op('ui_container2/Looper/Record3')
        widget.par.Value0 = val
        return
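The blink/smile randomisation can be sketched in the same style. This is a hypothetical illustration; the parameter names are placeholders, not the project's actual ones.

    import random

    # Placeholder reverb parameter names; the real patch uses its own set.
    REVERB_PARAMS = ['Roomsize', 'Damping', 'Mix', 'Predelay']

    axis_map = {'x': 'Roomsize', 'y': 'Mix'}

    def onBlink():
        # Pick two distinct parameters and reassign the X and Y axes to them.
        axis_map['x'], axis_map['y'] = random.sample(REVERB_PARAMS, 2)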
Generalised signal chain
Due to the increasing complexity of the project, I am only showing the signal chains below, with important sections highlighted in white text.
Development
Early tests with hand tracking. The more complex visuals had to be scrapped due to insufficient processing power.