I think my girlfriend secretly hates me. She’s currently head-down in books, doing her final/postgrad/super-diploma/masters/accreditation/thing to be a psychologist. Meanwhile, I’m playing with toys. Literally. And this is what I get to pass off as my final year of study. I’m so glad I didn’t listen to my parents.
Because we’ve just returned from a couple of weeks’ break, I haven’t had a chance to test any projects in situ. Instead, I’ve been trying out a bunch of different options for the hardware/software combination I will use in my final work. The main technologies I’ve taken a look at are the excellent DMaX (MaxForLive DMX lighting patch), cv.jit (OpenCV-based external set for Max/MSP/Jitter), Quartz Composer and OSC.
Unfortunately, most of these technologies make my project a little more locked-down, unlike the Processing examples I have been using in the past – MaxForLive is commercial software and Quartz Composer comes bundled with OS X. This isn’t a decision I made lightly, because it goes against my belief that I should make my work as available to others as possible. However, my limited knowledge of text-based coding made working with Processing for a project like this untenable. Hopefully some day I can present the code in a more open format.
I used Ableton Live to control lighting for a project last year. The difference between then and now is the addition of MaxForLive. Last time, Processing (via OSC) took signals from Live and OSCulator to change the brightness of some cheap 150W floodlights – very unreliably, probably because of the lack of economy in my coding. Max/MSP (via DMaX) is snappy and relatively simple to work with. The external hardware setup hasn’t changed between projects…
Now that Max/MSP lives handily within Live, I decided to start simple: by using MIDI notes to trigger lights. DMaX chewed this up and spat it out without any troubles, and although this video is pretty terrible, it goes to show just how much better the response is with the MaxForLive setup, compared to what I had running in Processing (again, I have little doubt that this was due to my poor coding)…
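DMaX handles all of this inside Live, but the underlying idea – a MIDI note picks a light, velocity sets its level – is simple enough to sketch. Here’s a toy Python version; the note-to-channel offset and the 0–255 DMX level range are my own illustrative assumptions, not anything pulled from the DMaX patch.

```python
def note_to_dmx(note: int, velocity: int) -> tuple[int, int]:
    """Map a MIDI note-on to a (DMX channel, DMX level) pair.

    Hypothetical mapping for illustration: middle C (note 60) drives
    DMX channel 1, each semitone above it the next channel, and MIDI
    velocity (0-127) is rescaled to the full DMX level range (0-255).
    """
    channel = note - 60 + 1
    level = round(velocity * 255 / 127)
    return channel, level

# A full-velocity middle C turns channel 1 all the way up...
print(note_to_dmx(60, 127))  # (1, 255)
# ...and a soft C# dimly lights channel 2.
print(note_to_dmx(61, 32))
```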
This was fine, but it’s unlikely that I’ll be getting an audience to lay down some funky beats for my final work, so I needed to figure out a way to get Max/MSP responding to movement. Having come across OpenCV while using Processing, I found Jean-Marc Pelletier’s excellent Max/MSP/Jitter external set, cv.jit.
cv.jit objects take care of some of the dirty behind-the-scenes work of computer vision, by providing access to things like frame differencing and blob detection. However, because I’m still coming to grips with Max/MSP, it wasn’t quite as easy as dragging and dropping these objects into my patch. Fortunately, a wonderful tutor by the name of Liubo Borissov at the Pratt Institute in New York has been putting videos of his class online. The videos aren’t well documented in their descriptions, but I couldn’t have got to this point without going through a few of them…
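For anyone wondering what frame differencing actually does under the hood, here’s a toy Python sketch of the core idea – cv.jit does this natively on Jitter matrices (and far more efficiently); the threshold value and the tiny 3×3 “frames” below are invented purely for illustration.

```python
THRESHOLD = 30  # arbitrary sensitivity value for this sketch

def frame_difference(prev, curr, threshold=THRESHOLD):
    """Return a binary motion mask: 1 wherever a pixel changed enough
    between two grayscale frames (given as 2D lists of 0-255 values)."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

def motion_amount(mask):
    """Fraction of pixels that changed -- a crude 'how much movement'
    signal of the kind you might route to lights or visuals."""
    total = sum(len(row) for row in mask)
    changed = sum(sum(row) for row in mask)
    return changed / total

# Two tiny 3x3 frames: one pixel jumps in brightness between them.
frame_a = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
frame_b = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]

mask = frame_difference(frame_a, frame_b)
print(motion_amount(mask))  # one of nine pixels changed
```

In a real patch that single “motion amount” number is exactly the kind of value you’d scale and send off over OSC.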
It was around this time that I received the suggestion that I work with Quartz Composer for visuals, rather than Processing. Quartz Composer is similar to Max/MSP in that it is a graphical programming language, and being built on OS X’s impressive graphics capabilities, it can do some pretty amazing looking stuff. In the same way that I can send messages between Max/MSP and Processing, Quartz Composer can send/receive messages via the OSC protocol. I had this up and running in no time, with the hexler qcOSC plug-in.
There are a couple of benefits to either of the above OSC methods. The main one is that OSC is a networking-style protocol, which means I can separate the graphics generation and the interaction data manipulation across different computers, lessening their workload. Using these programs in this way also means really taking advantage of what they excel at: Max/MSP in raw data manipulation and Quartz Composer or Processing in visualisation.
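To make the networking point concrete, here’s a rough Python sketch of what a single-float OSC message looks like on the wire, and how little it takes to fire one across a network – the /light/1/brightness address and port 9000 are placeholders I made up, not anything from my actual patches (Max/MSP, OSCulator and qcOSC all handle this packing for you).

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying a single float32 argument."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")                 # type tag string: one float
            + struct.pack(">f", value))      # big-endian float32

# Send a brightness value to a (hypothetical) listener -- which could
# just as easily be on another machine on the local network.
packet = osc_message("/light/1/brightness", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

Because it’s just small UDP packets, the machine crunching camera data and the machine drawing visuals only ever have to share messages like this one.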
As with Professional Practice, I needed to present my work thus far in class this week. Whilst these experiments are interesting for me, they don’t really explain what I am trying to achieve as a final project, so I created some mockup examples (warning, large file sizes ahead)…
Marrickville Council, and in particular their Arts and Cultural Development Coordinator, Caroline McLeod, have been a great help so far. The biggest potential sticking point for doing a public work like this is likely to be Council approval. If the work meets particular criteria, it may even need to go through the Development Application (DA) process, which is exactly the same process as if I were building a house. If my work ticks two of the following points, I’ll need a DA…
- needs a traffic management plan which changes conditions on a public road;
- runs over more than one date (i.e. extends over more than one day);
- has 5 or more stalls selling food or beverages;
- has 5 or more stalls selling other goods;
- expects a public participation of 1,000 or more people during the event;
- has amplified entertainment or video/cinema projection;
- charges an entry fee on public land; or
- any other event that Council deems should be subject to a DA.
Fortunately, McLeod wants to work with me to try and make sure I’m not ticking these boxes and therefore don’t need to go through this potentially expensive and slow process.
It will help my cause if I begin working with Council as soon as possible, so the next step is to find a location in the Marrickville area and get approval. McLeod has suggested a couple of options, which I’ve used for this week’s presentation (file is sans animations). However, I would like to spend a day or so scouting for something that I feel really comfortable with. Just need to wait on my new camera to arrive…