Code Like A Girl

The last week has been a bit bonkers, as I’ve been trying to fit work around helping a good friend with a family crisis. I haven’t got as much done as I would have liked, but some things are just more important, and I’ve still been able to make a little progress.

Over the weekend, I wrote an article for an online publication called Code Like A Girl, which you may have heard of if you're interested in encouraging more girls into STEM subjects, or you're a bit of a computer geek. During the week I've been back and forth with their editorial team, made a few minor changes, and they published it this evening. The article is a bit of a mashup of a couple of posts from this blog and describes how I made the video for Sunrise, but I thought I'd share it here anyway, as I'm feeling somewhat chuffed to have been included on their writing team and to have had an article published about my work. I anticipate writing at least one more article about a different video I've made with a similar technique, maybe going into a little more depth about the code this time.

Vorsprung Durch Technik

After some more head-butting, and reaching the point where I didn't think I'd be able to solve the original problem of routing live sound through a self-programmed music visualiser, I went back to basics. Ditching the sound module provided on the Creative Programming course, I looked into Processing's own sound library using the online documentation.

And bingo: using the sample code in the online tutorial, I suddenly had something that responded to input from the soundcard. Just like that. The graphics were terrible – just a fuzzy line at the bottom of the screen – but the body was still twitching, so to speak.
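For anyone curious, that first test amounted to something along these lines. This isn't my exact code, just a minimal sketch in the spirit of the Sound library's AudioIn and Waveform examples; the buffer size and the scaling of the line are placeholders of my own choosing.

import processing.sound.*;

AudioIn input;
Waveform waveform;
int samples = 512;  // number of audio samples drawn per frame

void setup() {
  size(640, 360);
  // grab audio from the first input channel on the soundcard
  input = new AudioIn(this, 0);
  input.start();
  // analyse the raw waveform of whatever is coming in
  waveform = new Waveform(this, samples);
  waveform.input(input);
}

void draw() {
  background(0);
  stroke(255);
  noFill();
  waveform.analyze();
  // draw the buffer as a jittery line near the bottom of the screen
  beginShape();
  for (int i = 0; i < samples; i++) {
    vertex(map(i, 0, samples - 1, 0, width),
           map(waveform.data[i], -1, 1, height - 10, height - 60));
  }
  endShape();
}

Not pretty, but it proved the signal was getting through, which was the whole point.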

So, moving on from there, I've incorporated the relevant commands into the visualiser code and developed the graphics further into something workable. The short video here is just a teaser: I want to keep the full graphics for live shows. In the clip, output from Ableton Live Lite is being picked up by the visualiser from the signal going through the soundcard and processed on the fly.
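Without giving away the actual visuals, the plumbing boils down to something like the sketch below: an Amplitude analyser listens to the same AudioIn, and its level drives whatever gets drawn each frame. The smoothing factor and the stand-in circle are just illustrative choices of mine, not what the real visualiser does.

import processing.sound.*;

AudioIn input;
Amplitude amp;
float smoothed = 0;  // smoothed level so the visuals don't flicker

void setup() {
  size(640, 360);
  input = new AudioIn(this, 0);  // live signal coming through the soundcard
  input.start();
  amp = new Amplitude(this);
  amp.input(input);
}

void draw() {
  background(0);
  // current RMS level of the live signal, roughly 0..1
  float level = amp.analyze();
  // simple smoothing so the graphics breathe rather than twitch
  smoothed = lerp(smoothed, level, 0.2);
  // stand-in graphics: the real show visuals go here instead
  noStroke();
  fill(255);
  float d = map(smoothed, 0, 1, 20, height);
  ellipse(width / 2, height / 2, d, d);
}

Swap the circle for your own drawing code and you have the bones of a live visualiser: anything played into the soundcard moves the picture in real time.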