The last few weeks have been a bit intense at times. I signed up for a game music composition course on Udemy in June – maybe not the sanest thing to do during a 30-day composition challenge, but I doubled up and used the homework as Tune-A-Day June tunes, which helped me crack on with it.
It’s the first Udemy course I’ve studied, and it’s going well. I wanted to write about what I’m learning for Code Like A Girl, because it involves Unity, a very popular game development ‘engine’*, and with the games industry being so buoyant it would be good to encourage more girls to get involved. First, though, I needed to write a long-overdue article completing the story of making video using Processing 3 to write basic animations, so I ended up writing two articles quite close together. The first is about making the music visualiser used in the Silver Bird video; the second is about what I’ve been up to on the game music course, which teaches you to make your music adapt to what is happening inside a game and trigger music cues based on events occurring. It’s been really interesting so far, and I feel much better prepared for making more music for games.
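To give a flavour of the event-triggered idea the course teaches: a game fires named events, and the music system swaps to whichever cue is mapped to that event. This is just a toy sketch in Python (the course itself works inside Unity, and the cue names here are made up); real playback would go through Unity’s audio system or middleware rather than a print statement.

```python
class MusicCueSystem:
    """Toy model of adaptive game music: map game events to music cues."""

    def __init__(self, cue_map):
        self.cue_map = cue_map      # event name -> cue (track) name
        self.current_cue = None

    def on_event(self, event):
        """Switch cues when a mapped game event occurs."""
        cue = self.cue_map.get(event)
        if cue and cue != self.current_cue:
            self.current_cue = cue  # a real system would crossfade here
        return self.current_cue


# Hypothetical cue map for illustration:
music = MusicCueSystem({
    "explore": "calm_theme",
    "enemy_spotted": "tense_theme",
    "boss_fight": "battle_theme",
})
music.on_event("explore")        # calm_theme starts
music.on_event("enemy_spotted")  # music adapts: switches to tense_theme
print(music.current_cue)
```

The interesting design work is all in the transitions, of course: rather than cutting abruptly between cues, game music tends to crossfade or wait for a musically sensible moment to switch.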
I’ve also spent some time trying to learn a little more about Unity than the game music course covers, by looking at Unity tutorials and just playing around with it. (If you read the second recent article for Code Like A Girl, you’ll see I had some fun body-modding the player character and giving his glasses a makeover.) I definitely want to learn more about using Unity, as I’d like to create a simple game I can use for making video footage. This will involve learning to code in C#, however, so that will stretch the old grey matter more than a little when I get to grips with it properly. I’ve only really scratched the surface so far. (Unity can also take scripts in UnityScript, a JavaScript-like language, but the tutorials concentrate on C#, so I’ll just follow those; the two languages seem very similar from my perspective, anyway.)
Unity gives you a fantastic platform for making animations, from what I can see, and once you’re proficient you can almost treat it like a film set: setting up camera angles, getting the camera to follow a character’s movements, zooming in to the action, and so on. I’m not sure how I’m going to fit in all this learning, though. It may be quite a long time before I can make something really decent – if I get that far.
New Game – Way of the Bubble
Meanwhile, I’ve had my first glimpse of the Way of the Bubble game, the first game in the TrickJazz chillout mobile games series, which I’ve already mentioned a few times. The game I was originally scheduled to be in, Dreamwalker, has been delayed, so they’ve included my tune Sunset Landscape as one of the tunes in Way of the Bubble. The game is now in beta testing, and I’ve already had a go at playing it, which I’m naturally quite excited about.
*I’m not sure how these things came to be called engines – Unity is a development environment where you pull together and organise all the different elements that go into making a game, and test it. It has built-in elements, like a ‘physics engine’, so you can apply gravity to your objects and make them bounce off walls, and so on. You can also buy additional items to extend the possibilities, particularly of the graphics elements available.
The last week has been a bit bonkers, as I’ve been trying to fit work around helping a good friend with a family crisis. I haven’t got as much done as I would have liked, but some things are just more important, and I’ve still been able to make a little progress.
Over last weekend, I wrote an article for an online publication called Code Like A Girl, which you may have heard of if you’re interested in encouraging more girls to do STEM subjects, or you’re a bit of a computer geek. During the week I’ve been in touch with the editorial team, made some minor changes, and they published the article this evening. It’s a bit of a mashup of a couple of the articles in this blog, and describes how I made the video for Sunrise, but I thought I’d share it here anyway, as I’m feeling somewhat chuffed to have been included on their writing team and to have had an article published about my work. I anticipate writing at least one more article about a different video I’ve made with a similar technique, maybe going into a little more depth about the code this time.
I’ve already mentioned in previous posts that I’ve been working on a music visualiser application based on Digital Signal Processing (DSP) of sound, which could then be used to project images to screen during performances of my music. It’s a significant detour from writing music itself, but would be particularly valuable for when I am performing instrumental pieces, to add interest to the listener experience.
Unfortunately, I have a few challenges to overcome before my work so far can be used to animate live music. Namely, I can currently only use the visualiser on pre-recorded music, which obviously isn’t any good for live work, and it only copes with small files at this stage. So, I need to learn how to get it to accept streaming audio, and figure out how to get a live sound signal into it – if indeed that is possible.
I discovered another potential use for my work today, however: using the applications I’m writing to generate video art. It turns out that this is pretty easy to do by recording the app running on my screen with QuickTime, then trimming the recording in video editing software. The most difficult part seems to be getting the sound and images to line up where they are supposed to be synchronised, because recording the app doesn’t capture the sound (unless that was a mistake on my part… I must check whether I missed any settings). I’ve probably not synced the attached example completely accurately, but it’s close enough this time.
A couple of weeks ago, I went to a couple of events at Lincoln’s Sonophilia Festival (the UK one, not the one in Nebraska). One of these events was the Weird Garden experimental music club, and a chap called Dave C was demonstrating Lissajous curves by generating four tones using a Raspberry Pi and an Arduino board, then plotting these on an oscilloscope, projected so we could all see it. I thought that this would interest my Dad – he’s been known to experiment with a Raspberry Pi – and sent him some photos of the set-up.
Well, this started a conversation and a half. It basically headed in the direction of ‘you should learn some Digital Signal Processing’, with me misunderstanding what that might entail, thinking it would involve circuit board design and a very steep learning curve. There was a lot of talk at cross-purposes, but Dad eventually explained that I would need to learn some new programming skills, in an environment called VisualDSP++. The boards are already designed, so I wouldn’t need to worry about that, and you can do DSP on your computer anyway, because computers already have the hardware needed for it.
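Incidentally, the Lissajous figures Dave C was showing are surprisingly simple underneath: one tone drives the oscilloscope’s horizontal deflection and another drives the vertical, and the frequency ratio and phase between them determine the shape. A minimal Python sketch of the maths (not his actual setup, which generated real tones in hardware):

```python
import math

def lissajous(freq_x, freq_y, phase, steps=1000):
    """Points of a Lissajous figure: one tone drives the x-deflection,
    another drives the y-deflection of the scope trace."""
    points = []
    for i in range(steps):
        t = 2 * math.pi * i / steps
        x = math.sin(freq_x * t + phase)
        y = math.sin(freq_y * t)
        points.append((x, y))
    return points

# Equal frequencies with a 90-degree phase offset trace a circle;
# ratios like 3:2 give the classic looping pretzel shapes.
pts = lissajous(3, 2, math.pi / 2)
```

Plotting those points (or feeding the two tones to a scope in X–Y mode, as Dave did) draws the curve.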
I’m still working my way through week 1 of the course, but the images above are all screenshots made from code that I put together* and then ran and interacted with to make art. I modified the code a little between each image captured.
Even if I get horribly stuck from here onward, I will have an app that I can use to make some unique album art!
*Disclaimer: my code also uses functions taken from a module coded by the course providers.