After some more head-butting, and having reached the point where I didn't think I could solve the original problem of routing live sound through a self-programmed music visualiser, I went back to basics. Ditching the sound module provided on the Creative Programming course, I looked into Processing's own sound library via the online documentation.
And bingo: using the sample code from the online tutorial, I suddenly had something responding to input from the soundcard. Just like that. The graphics were terrible (just a fuzzy line at the bottom of the screen), but the body was twitching, so to speak.
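For anyone curious, the Processing sound library's waveform example boils down to something like the sketch below. This is a minimal reconstruction in the spirit of the tutorial code, not my actual visualiser: it opens the default audio input, analyses the incoming signal with the library's `Waveform` class, and draws it as a line near the bottom of the window.

```processing
import processing.sound.*;

AudioIn input;
Waveform waveform;
int samples = 256;  // number of waveform samples to read per frame

void setup() {
  size(640, 360);

  // Open the first available audio input (channel 0 on the soundcard)
  input = new AudioIn(this, 0);
  input.start();

  // Attach a waveform analyser to the live input
  waveform = new Waveform(this, samples);
  waveform.input(input);
}

void draw() {
  background(0);
  waveform.analyze();  // fills waveform.data[] with values in -1..1

  stroke(255);
  noFill();
  beginShape();
  for (int i = 0; i < samples; i++) {
    // Map each sample to a thin strip along the bottom of the screen
    vertex(map(i, 0, samples - 1, 0, width),
           map(waveform.data[i], -1, 1, height - 10, height - 60));
  }
  endShape();
}
```

Run with any signal going into the soundcard and the line wobbles in time with the audio; from there it's a matter of replacing the simple line with more interesting graphics.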
Moving on from there, I've incorporated the relevant commands into the visualiser code and developed the graphics further into something workable. The short video here is just a teaser: I want to keep the full graphics for live shows. In it, the visualiser picks up the output from Ableton Live Lite as the signal passes through the soundcard, and processes it on the fly.