The last few weeks have been a bit intense at times. I signed up for a game music composition course on Udemy in June – maybe not the sanest thing to do during a 30-day composition challenge, but I doubled up and used the homework as Tune-A-Day June tunes, which helped me crack on with it.
It’s the first Udemy course I’ve taken, and it’s going well. I wanted to write about what I’m learning for Code Like a Girl, because it involves Unity, a very popular game development ‘engine’*. With the games industry being so buoyant, it would be good to encourage more girls to get involved. But first I needed to write a long-overdue article to complete the story of making video from basic animations written in Processing 3, so I ended up writing two articles quite close together. The first is about making the music visualiser used in the Silver Bird video; the second is about what I’ve been up to on the game music course, which teaches you to make your music adapt to what is happening inside a game and to trigger music cues when in-game events occur. It’s been really interesting so far, and I feel much better prepared for making more music for games.
I’ve also spent some time trying to learn a little more about Unity than the game music course covers, by looking at Unity tutorials and just playing around with it. (If you read the second recent article for Code Like A Girl, you’ll see I had some fun body-modding the player character and giving his glasses a makeover.) I definitely want to learn more about Unity, as I’d like to create a simple game that I can use for making video footage. This will involve learning to code in C#, however, so it will stretch the old grey matter more than a little when I get to grips with it properly; I’ve only really scratched the surface so far. (Unity scripts can also be written in its JavaScript-like UnityScript, but the tutorials seem to concentrate on C#, so I’ll just follow those – the two languages look very similar from my perspective, anyway.)
From what I can see, Unity gives you a fantastic platform for making animations, and once you’re proficient you can almost treat it like a film set: setting up camera angles, getting the camera to follow a character’s movements, zooming in on the action, and so on. I’m not sure how I’m going to fit in all this learning, though. It may be quite a long time before I can make something really decent – if I get that far.
New Game – Way of the Bubble
Meanwhile, I’ve had my first glimpse of Way of the Bubble, the first game in the TrickJazz chillout mobile games series, which I’ve already mentioned a few times. The game I was originally scheduled to be in, Dreamwalker, has been delayed, so they’ve included my tune Sunset Landscape as one of the tunes in Way of the Bubble instead. The game is now in beta testing, and I’ve already had a go at playing it, which naturally I’m quite excited about.
*I’m not sure how these things came to be called engines – Unity is a development environment where you pull together and organise all the different elements that go into making a game, and test it. It has built-in components, like a ‘physics engine’, so you can apply gravity to your objects, make them bounce off walls, and so on. You can also buy additional assets to extend the possibilities, particularly for the graphics.
The last week has been a bit bonkers, as I’ve been trying to fit work around helping a good friend with a family crisis. I haven’t got as much done as I would have liked, but some things are just more important, and I’ve still been able to make a little progress.
Over last weekend, I wrote an article for an online publication called Code Like A Girl, which you may have heard of if you’re interested in encouraging more girls into STEM subjects, or if you’re a bit of a computer geek. During the week I corresponded with the editorial team, made some minor changes, and they published the article this evening. It’s a bit of a mashup of a couple of posts from this blog, describing how I made the video for Sunrise, but I thought I’d share it here anyway, as I’m feeling somewhat chuffed to have been included on their writing team and to have had an article published about my work. I anticipate writing at least one more article about a different video I’ve made with a similar technique, maybe going into a little more depth about the code this time.
Over the last week or so, as well as working on the mixes for my first album, I decided to look into fractals, as it seemed like a logical next step from the graphics animations I’ve been working on. I found that Wikipedia has a lot of good resources, but probably too much detail for a beginner, especially if, like me, you follow all the rabbit holes it leads you down. A far better introduction was a YouTube video, which brought together pretty much everything I’d read to date and then some. (Sadly it has been taken down since I wrote this post, so I can no longer share it with you.)
At least some of the more visually appealing fractal patterns – the Mandelbrot and Julia sets, for example – are constructed using maths that involves the square root of minus one, i.e. complex numbers. Those sets also looked as though they would not be easy to animate: the whole image is calculated pixel by pixel, with each pixel’s colour set by how many iterations it takes for the value at that point to escape past a threshold.
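To give a flavour of that pixel-by-pixel approach, here’s a minimal escape-time sketch for the Mandelbrot set in Processing. It’s my own illustration of the general technique rather than anything from the video, and the coordinate ranges are just a conventional view of the set:

```
// Escape-time rendering of the Mandelbrot set:
// each pixel maps to a complex number c, and we count how many
// iterations of z = z*z + c it takes for |z| to exceed 2.
int maxIter = 100;

void setup() {
  size(600, 400);
  noLoop();  // the image only needs to be calculated once
}

void draw() {
  loadPixels();
  for (int x = 0; x < width; x++) {
    for (int y = 0; y < height; y++) {
      // map the pixel to a point c = ca + cb*i in the complex plane
      float ca = map(x, 0, width, -2.5, 1.0);
      float cb = map(y, 0, height, -1.0, 1.0);
      float za = 0, zb = 0;
      int n = 0;
      while (n < maxIter && za*za + zb*zb < 4) {
        float tmp = za*za - zb*zb + ca;  // real part of z*z + c
        zb = 2*za*zb + cb;               // imaginary part
        za = tmp;
        n++;
      }
      // colour by how quickly the point escaped the threshold
      pixels[y*width + x] = color(map(n, 0, maxIter, 0, 255));
    }
  }
  updatePixels();
}
```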
There are some (more basic) fractals that were easier to understand, however, such as the Cantor set and the Sierpinski carpet, which are built by an iterative process. The Cantor set starts with a line and removes the middle third; then you rinse and repeat, removing the middle third of each of the new lines at every iteration. The Sierpinski carpet does something similar with squares, repeatedly removing the middle square of a 3×3 grid. I could see a way through the fog for programming visuals for these types of fractals in Processing, and I’ve incorporated some of them into a new set of visuals to display during my next live gig.
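The iterative idea translates into surprisingly little Processing code. Here’s a minimal recursive sketch of the Cantor set – an illustration of the principle, not my actual gig visuals:

```
// Recursive Cantor set: draw a line, then recurse on the left
// and right thirds, dropping the middle third each generation.
void setup() {
  size(640, 360);
  background(255);
  stroke(0);
  strokeWeight(4);
  cantor(20, 40, width - 40);
}

void cantor(float x, float y, float len) {
  if (len < 1) return;  // stop once the segments get too small
  line(x, y, x + len, y);
  // next generation, drawn a little lower for visibility:
  // keep the left and right thirds, remove the middle
  cantor(x, y + 30, len / 3);
  cantor(x + 2 * len / 3, y + 30, len / 3);
}
```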
After some more headbutting and reaching a point where I didn’t think I would be able to solve the original problem of routing live sound through a self-programmed music visualiser, I went back to basics. Ditching the sound module provided on the Creative Programming course, I looked into Processing’s own sound library using the online documentation.
And bingo: using the sample code in the online tutorial, I suddenly had something responding to input from the soundcard. Just like that. The graphics were terrible – just a fuzzy line at the bottom of the screen – but the body was twitching, so to speak.
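The core of it looks something like the sketch below. This is my reconstruction of the kind of example in the tutorial, assuming a recent version of the Processing Sound library (2.x), which provides the AudioIn and Waveform classes; it just draws the incoming waveform as a line near the bottom of the screen:

```
import processing.sound.*;

Waveform waveform;
int samples = 200;  // audio samples captured per frame

void setup() {
  size(640, 360);
  // grab input from the soundcard's default channel
  AudioIn in = new AudioIn(this, 0);
  in.start();
  // analyser that takes a snapshot of the incoming waveform
  waveform = new Waveform(this, samples);
  waveform.input(in);
}

void draw() {
  background(0);
  stroke(255);
  noFill();
  waveform.analyze();
  beginShape();
  for (int i = 0; i < samples; i++) {
    // plot each sample near the bottom of the screen
    float x = map(i, 0, samples - 1, 0, width);
    float y = map(waveform.data[i], -1, 1, height - 10, height - 60);
    vertex(x, y);
  }
  endShape();
}
```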
So, moving on from there, I’ve incorporated the relevant commands into the visualiser code and developed the graphics further to create something workable. The short video here is just a teaser: I want to keep the full graphics for live shows. In it, output from Ableton Live Lite is picked up by the visualiser from the signal going through the soundcard and processed on the fly.
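Without giving away the live-show graphics, here’s a hint of how audio input can drive visuals: the same library’s Amplitude analyser turns the current loudness into a number you can map onto any visual parameter. A minimal sketch (my illustration, not the visualiser code):

```
import processing.sound.*;

Amplitude amp;

void setup() {
  size(640, 360);
  noStroke();
  // listen to whatever is coming through the soundcard,
  // e.g. output from Ableton routed back in
  AudioIn in = new AudioIn(this, 0);
  in.start();
  amp = new Amplitude(this);
  amp.input(in);
}

void draw() {
  background(0);
  // analyze() returns the current volume between 0 and 1;
  // here it scales a circle on the fly
  float level = amp.analyze();
  float diameter = map(level, 0, 1, 20, height);
  fill(100, 200, 255);
  ellipse(width / 2, height / 2, diameter, diameter);
}
```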