Digital Video, DIY style

I’ve already mentioned in previous posts that I’ve been working on a music visualiser application based on Digital Signal Processing (DSP) of sound, which could then be used to project images to a screen during performances of my music. It’s a significant detour from writing music itself, but it would be particularly valuable when I am performing instrumental pieces, adding interest to the listener experience.

Unfortunately, I have a few challenges to overcome before my work so far can be used to animate live music. At the moment the visualiser only works on pre-recorded music, which obviously isn’t any good for live work, and it only copes with small files at this stage. So, I need to learn how to get it to accept streaming audio, and figure out how to get a live sound signal into it. If indeed that is possible.
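For what it’s worth, something along these lines is roughly the shape I imagine for the live-input side, assuming a Python-based visualiser and the sounddevice library (placeholders for whatever I actually end up using, not the code of my app): open the default input device and hand fixed-size blocks of samples to a callback, where the DSP would run.

```python
# Sketch only: live audio capture feeding a per-block analysis callback.
# Assumes Python with the sounddevice and numpy libraries; the block size
# and sample rate are arbitrary placeholders.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100   # Hz
BLOCK_SIZE = 2048     # samples per analysis block

def audio_callback(indata, frames, time, status):
    """Called by the audio driver for each block of live input."""
    if status:
        print(status)                      # report dropouts / overflows
    mono = indata[:, 0]                    # take the first channel
    spectrum = np.abs(np.fft.rfft(mono))   # stand-in for the real DSP
    # ...hand `spectrum` to the drawing code here...

# Open the default input device and stream until we stop it.
with sd.InputStream(samplerate=SAMPLE_RATE,
                    blocksize=BLOCK_SIZE,
                    channels=1,
                    callback=audio_callback):
    sd.sleep(10_000)  # keep the stream alive for ~10 seconds
```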

I discovered another potential use for my work today, however, and that is to use the applications I’m writing to generate video art. It turns out that this is pretty easy to do by recording the app running on my screen with QuickTime and then trimming the result in video editing software. The most difficult part seems to be getting the sound and images to line up where they are supposed to be synchronised, because recording the app running doesn’t capture the sound (unless that was a mistake on my part… I must check whether I missed any settings). I’ve probably not synced up the attached example completely accurately, but it’s close enough this time.
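In case it helps anyone trying the same trick, the workaround I’ve been experimenting with is to mux the original audio back into the silent screen recording afterwards. Here’s a rough sketch driving ffmpeg from Python; the file names and the offset value are just placeholders, not a recipe I’ve fully verified:

```python
# Sketch only: remux a silent screen recording with the original audio,
# nudging the audio start to line the two up. Assumes ffmpeg is installed;
# file names and the offset are placeholders.
import subprocess

OFFSET_SECONDS = "0.35"  # how late the audio should start, found by trial and error

subprocess.run([
    "ffmpeg",
    "-i", "screen_recording.mov",   # silent capture of the visualiser
    "-itsoffset", OFFSET_SECONDS,   # delay applied to the next input
    "-i", "original_audio.wav",     # the track that was playing
    "-map", "0:v", "-map", "1:a",   # video from input 0, audio from input 1
    "-c:v", "copy",                 # don't re-encode the video
    "-c:a", "aac",                  # encode the audio for the MP4 container
    "-shortest",                    # stop at the shorter of the two
    "synced_output.mp4",
], check=True)
```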
