It’s been far too long since the last post (exams looming large over my otherwise peaceful existence), and given that my fellow SoCers have managed it I think it must be time for a status update. Firstly (and perhaps most importantly) we have sound! The screenshot below shows pygame drawing a bouncy XY pad, all created as a Kamaelia component. This then outputs OSC messages, which are read by Pure Data (PD) and used to trigger and modulate percussive noise.
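For the curious, an OSC message on the wire is just an address string, a type tag string, and the arguments, each padded out to four-byte boundaries. Here’s a minimal sketch of how the XY pad’s position might get packed into a raw OSC packet – note that the /jam/xypad/1 address is made up for illustration, not what my component actually sends:

```python
import struct

def osc_pad(value: bytes) -> bytes:
    """Pad OSC data with NULs to a multiple of 4 bytes."""
    return value + b"\x00" * (-len(value) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a binary OSC message carrying float32 arguments."""
    # Address string: NUL-terminated, padded to a 4-byte boundary
    data = osc_pad(address.encode("ascii") + b"\x00")
    # Type tag string: ',' followed by one 'f' per float argument
    data += osc_pad(b"," + b"f" * len(floats) + b"\x00")
    # Arguments: big-endian 32-bit floats
    for f in floats:
        data += struct.pack(">f", f)
    return data

# The pad's x,y position (0.0-1.0) as a single OSC packet:
packet = osc_message("/jam/xypad/1", 0.25, 0.75)
```

In practice you’d hand that packet to a UDP sender component and point it at PD’s listening port.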
If you look closely at the PD patch you can begin to get an idea of how easy it is to work with OSC messages – the OSCroute objects make it simple to split up the different messages, giving you complete control over what is received and where it is used. So what does it sound like? Well, kinda chaotic and noisy and full of goodness – listen here.
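The OSCroute idea translates naturally into other languages too: it’s essentially a dispatcher keyed on the first component of the address, passing the remainder of the address down to the handler. A rough Python sketch of the same splitting (the /jam prefix and handler names are invented for illustration):

```python
def osc_route(prefix: str, handlers: dict, address: str, *args):
    """Dispatch an OSC message on the first address component after
    `prefix`, roughly what PD's OSCroute object does in a patch."""
    if not address.startswith(prefix):
        return None
    rest = address[len(prefix):].lstrip("/")
    head, _, tail = rest.partition("/")
    handler = handlers.get(head)
    if handler is None:
        return None
    # Hand the remaining address and the arguments to the handler
    return handler("/" + tail if tail else "/", *args)

received = {}
handlers = {
    "trigger": lambda addr, *a: received.setdefault("trigger", a),
    "modulate": lambda addr, *a: received.setdefault("modulate", a),
}
osc_route("/jam", handlers, "/jam/trigger", 1.0)
osc_route("/jam", handlers, "/jam/modulate/cutoff", 0.5)
```

Chain a couple of these together and you get the same tree-shaped routing you see in the PD patch.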
Changing tack slightly, over at Create Digital Music there is an interesting interview with the guys from Future Audio Workshop talking about their rather lovely looking new synth Circle. There’s a lot of interesting stuff about interface design, and also a bit about its use of OSC. I think Gavin Burke, FAW’s co-founder, sums up the difference between OSC and MIDI nicely:
You’re not trying to think about MIDI and what MIDI channels you’re on and all that kind of stuff; it just keeps it nice and simple. [OSC] is where it’s heading. It’s where most people want to be; most people don’t want to [have to do MIDI mappings].
Read the whole thing though – it’s interesting stuff.
Whilst idly procrastinating over the last couple of days I’ve come across two things which are separately very cool, and together even cooler. I’ve been keeping track of the developments in homemade multitouch stuff for a while (fuelled by the shocking awesomeness that is reactable), but always thought it all looked like a bit of a hassle – uni work is currently restricting me to five-minute hacks. That was all until I saw this video on Instructables – literally a five-minute, ~£15 multitouch box.
Looks great, huh? It gets better – the box uses Touchlib, a C++ library for multitouch detection. Touchlib does all manner of clever things (which I vaguely understand), but most interestingly it will now output TUIO messages. TUIO is a subset of everyone’s favourite control protocol, OSC (that post saying how awesome OSC is will come very soon). It’s literally a matter of running an executable and you have a webcam->TUIO system up and running. This gets you the x,y position of each object, orientation data, and other fancy stuff with no hassle.
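To give a flavour of what those messages carry: in the TUIO 1.0 object profile, a /tuio/2Dobj “set” message holds a session ID, a fiducial (marker) ID, the x,y position, the angle, and then velocity/acceleration fields. A small sketch of pulling the useful bits out of an already-decoded message:

```python
from collections import namedtuple

TuioObject = namedtuple("TuioObject", "session_id fiducial_id x y angle")

def parse_2dobj_set(args):
    """Extract the interesting fields from the arguments of a decoded
    /tuio/2Dobj message (TUIO 1.0 'set' layout: s i x y a X Y A m r)."""
    if not args or args[0] != "set":
        return None  # 'alive' and 'fseq' messages handled elsewhere
    session_id, fiducial_id, x, y, angle = args[1:6]
    # args[6:] are velocities and accelerations -- ignored here
    return TuioObject(session_id, fiducial_id, x, y, angle)

# A decoded message for marker 4 at (0.5, 0.25), rotated ~90 degrees:
obj = parse_2dobj_set(["set", 12, 4, 0.5, 0.25, 1.57,
                       0.0, 0.0, 0.0, 0.0, 0.0])
```

Positions are normalised to 0.0–1.0, so they map straight onto whatever screen or musical range you fancy.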
So why, as a very bad C++ programmer, am I so excited about this? Well, the second thing I found was this: a Python library for interpreting those super-simple-to-create TUIO messages. Now I’m no expert, but the sample code for pyTUIO looks very much to me like a Kamaelia component waiting to happen. Stick it in a class, put in a yield statement and there you have it. That’s Webcam->Multitouch Interface->TUIO->Python->Kamaelia in very simple steps. So simple I’m off to eBay to buy a webcam! Hopefully you will see this working as a small aside from my Kamaelia Jam work, and see a multitouch box being used to make music with Jam!
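To show what I mean by “stick it in a class, put in a yield statement”, here’s a toy sketch of the shape a component like that takes. This is deliberately not real Axon code – an actual Kamaelia component would subclass Axon.Component.component and use self.send() – and the TUIO source is faked, but the generator pattern is the point:

```python
class TuioComponent:
    """Toy stand-in for a Kamaelia component: poll a TUIO source,
    push events to an outbox, then yield control back to the
    scheduler. (Real Axon would do self.send(event, "outbox").)"""

    def __init__(self, source):
        self.source = source  # anything with .poll() -> list of events
        self.outbox = []

    def main(self):
        while True:
            for event in self.source.poll():
                self.outbox.append(event)
            yield 1  # one scheduler slice done; come back later

class FakeTuioSource:
    """Hypothetical stand-in for pyTUIO's tracking object."""
    def __init__(self, frames):
        self.frames = list(frames)

    def poll(self):
        return self.frames.pop(0) if self.frames else []

source = FakeTuioSource([[("cursor", 0.1, 0.2)], [("cursor", 0.3, 0.4)]])
comp = TuioComponent(source)
runner = comp.main()
next(runner)  # first scheduler slice: first frame lands in the outbox
next(runner)  # second slice: second frame follows
```

The yield is what makes it play nicely with Kamaelia’s cooperative scheduler – each component does a little work, then hands control back.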
Following much debate (the majority of which was significantly silly – I’m sure one day I’ll name an app Suffer Little Children, but that day is not today!) I’m pleased to announce that my SoC project has gone from having a lot of words in its name to just two.
And the winner is… Kamaelia Jam
Just to give the naming scheme a whole new level of irritating quirkiness, please prepare yourselves for the first release sometime over the summer: codename Apricot.
Edit: Oops – stuck the wrong link in up there… Sorry!
OK, this has been promised for a while now – here’s the official “this is my GSoC project” post.
I’ve been accepted by the lovely folks at BBC R&D, and more specifically Kamaelia, to work on making a sequencer of sorts which can be edited in real time by lots of people in lots of different places. I say of sorts because I’m not currently sure quite how much sequencer capability it will have – in my mind it’s more of a hybrid sequencer/live performance tool, but this may well change over the course of the summer. Anyway, they say a picture is worth 1000 words, so here’s the UI mockup which I included with my proposal:
A couple of things you might notice from this:
- All those forward slashes everywhere – They are (or will be) Open Sound Control (OSC) messages. OSC is officially awesome, and looks likely to become the future of audio messaging, being more powerful, more flexible, and less insane (0x90, anyone?) than MIDI. And monome uses it, so it’s gotta be good, right? That said, I’ll be providing MIDI output too, so don’t ditch that host/hardware just yet…
- All those dots everywhere – These indicate whereabouts in Kamaelia’s rather extensive jumble of components some of my bits will fit. Starting to work with Kamaelia has reminded me a lot of working in Bidule – you make the components and connect them up any way you see fit. This makes it awesome (or so I hope – it looks it) for prototyping stuff. Want your brilliant new piano roll to fling data via OSC, whilst telling you what it’s up to? Hook it up to the OSC component and a ConsoleEchoer and you’re away. This is good!
- In the mockup every user can control the step sequencer and piano roll in the middle, but only a single user has control over each of the X-Y pads. This adds a nice bit of chaos to proceedings – music would be dull if you knew what was going to happen all the time. Hopefully this will also be a simple matter of switching a public/private boolean – fancy!
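To unpack the 0x90 jab above: a MIDI note-on is three opaque bytes, with the channel crammed into the low nibble of the status byte, whereas the OSC version is a human-readable address plus plain arguments. A quick comparison (the OSC address here is invented, not Jam’s actual scheme):

```python
def midi_note_on(channel: int, note: int, velocity: int) -> bytes:
    """MIDI note-on: status byte 0x90 | channel, then two 7-bit
    data bytes for note number and velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Middle C at velocity 100 on channel 1:
msg = midi_note_on(0, 60, 100)

# The OSC equivalent is self-describing -- no status bytes to memorise:
osc_equivalent = ("/jam/pianoroll/noteon", 60, 100)
```

Three magic bytes versus an address you can actually read – that’s the difference Gavin Burke is getting at in the interview above.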
So why is all this generally awesome? Lots of reasons. Firstly, almost all electronic music you see is made by one person standing at a laptop (Kraftwerk excepted – but I reckon they could get by without three of them…). Whilst this is all well and good, part of the fun of making music is the group activity and playing off of each other’s ideas. This is one of the things which hasn’t translated so well to electronic music. Hopefully creating new tools to make music collaboratively will bring a new dimension to the fine art of standing in front of a PC twiddling knobs. Also, hopefully Kraftwerk will use it – Bug #1, methinks?
Other awesome stuff? Well, there’s not much point having a great tool for collaboration if the results aren’t interesting. For me the really cool thing is that, because everything works via MIDI or OSC, every user will potentially hear something different, depending on how they have their instruments and effects set up. Even better than that, eventually it will all be peer-based. Each user will be able to select which of their peers to connect to, resulting in different music being played depending on who the user has been connected to at what times. With enough users in the network, everyone will be listening to their own unique performance of what is effectively the same piece. Cool, huh?
So finally I should probably give a few shouts (in true Westwood style) to my lovely fellow SoCers, the Kamaelia crew, and especially Sylvain for being foolhardy enough to put his name down as my mentor. Oh, and here’s a video of some of my inspiration for this project, because videos at the end is how I roll…
Hello, and welcome to the first post of my lovely spangly new blog. It has been set up mainly to track my progress on Google Summer of Code (more on that when I have time). Hopefully it’ll also become a bit of a braindump for cool stuff I’m up to, and incompetent musings on things which have been taking up much-needed space in my head.
For now I’m going to get something to eat, and will leave you with this video of the rather incredible Absolut Quartet. A nice start to some new scribblings methinks.