Digital Studio – Week 05

Room for one colour (1997) by Olafur Eliasson. Image: Jens Ziehe

I finally made it along to the Olafur Eliasson exhibition at Sydney’s MCA this week. If you haven’t… well, it will probably have finished by the time you read this.

It was good. Actually, it was pretty great, but tempered by the fact that everyone had been telling me just how great it was for the past couple of months. Like Where The Wild Things Are and Alice In Wonderland (aka Alice In Wonderland: An IMAX 3D Experience), it was always going to struggle to live up to the hype. I guess that’s the trouble when something is created to be accessible to all: suddenly everyone is a critic. Hats off to Eliasson, then, that most of the criticism was overwhelmingly positive.

Possibly reflecting the state of my own work process, what interested me most were the more basic, experiment-like pieces. Unlike many of the ‘high-brow’ art critics, I think this experimental work is just as valuable and worthy as the finished outcomes.

Remagine (2002) by Olafur Eliasson

The one installation that grabbed my attention was probably the simplest on show: Remagine (2002). A series of five spotlights fitted with differently shaped gobos cleverly gave the sense of changing perspectives within the room. This play with light is in essence the direction my work is taking at the moment: regardless of whether it comes from a globe or a projector, the premise is the same.

Using Eliasson’s piece as a starting point, I have begun to put together simple Processing sketches of perspective-related geometry. The difference with these is that they are interactive. The following sketches use mouse input as the control, but there’s no reason that tracking the movement of people couldn’t have the same effect. The first is very basic…

[processing file=”/wp-content/uploads/2010/04/testExperiment02_20100405.jar”]

Source code: testExperiment02_20100405
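
For anyone who can’t run the applet, here’s a rough illustration of the kind of thing I mean – not the source linked above, just a minimal Processing sketch along the same lines, with the mouse acting as a movable vanishing point:

void setup() {
  size(600, 400);
  stroke(0);
}

void draw() {
  background(255);
  // the mouse position acts as the vanishing point
  float vx = mouseX;
  float vy = mouseY;
  int steps = 20;
  // draw receding lines from the frame edges towards the vanishing point
  for (int i = 0; i <= steps; i++) {
    float x = map(i, 0, steps, 0, width);
    line(x, 0, vx, vy);      // from the top edge
    line(x, height, vx, vy); // from the bottom edge
  }
  for (int i = 0; i <= steps; i++) {
    float y = map(i, 0, steps, 0, height);
    line(0, y, vx, vy);      // from the left edge
    line(width, y, vx, vy);  // from the right edge
  }
}

Move the mouse around and the whole frame appears to tilt towards a new perspective – a very crude version of what the sketch above is doing.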

I wasn’t happy with the flatness of this original sketch, so I tried adding gradient fills to the shapes. For anyone interested in doing the same, you’ll need to use the OpenGL library to get this effect (see the source code for a clear explanation). I’ve noticed a few issues with the rendering of the following sketch in browsers – if you’re having trouble, click here for a more reliable version…

[processing file=”/wp-content/uploads/2010/04/testExperiment02_20100406.jar”]

Source code: testExperiment02_20100406
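
To give a sense of what the gradient trick involves, the rough pattern is: switch to the OPENGL renderer and call fill() between the vertex() calls of a shape, and the colours are interpolated across it. Again, this is only an illustrative sketch, not the source linked above:

import processing.opengl.*;

void setup() {
  size(600, 400, OPENGL);
  noStroke();
}

void draw() {
  background(0);
  // a quad with a different fill at each vertex;
  // the OPENGL renderer blends the colours across the shape
  beginShape(QUADS);
  fill(map(mouseX, 0, width, 0, 255), 0, 100);
  vertex(100, 100);
  fill(0, map(mouseY, 0, height, 0, 255), 150);
  vertex(width - 100, 100);
  fill(255);
  vertex(width - 100, height - 100);
  fill(50);
  vertex(100, height - 100);
  endShape();
}

With the default renderer the same code just comes out as a flat colour, which is why the OpenGL library is needed.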

OpenCV is likely to be a large element of the way I track movement, whether I use Processing, Max/MSP, or another language. Created by Intel, it takes care of a bunch of low-level procedures, making things like motion tracking and face detection relatively easy. If none of this is making any sense, there is a great tutorial by Andy Best over at CDMu to get you started with OpenCV in Processing.
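
As a taste of how little code is involved, here’s a minimal face-detection sketch. I’m assuming the hypermedia.video OpenCV library for Processing here, so treat it as an illustration rather than a recipe:

import hypermedia.video.*;
import java.awt.Rectangle;

OpenCV opencv;

void setup() {
  size(320, 240);
  opencv = new OpenCV(this);
  opencv.capture(width, height);                   // grab frames from the webcam
  opencv.cascade(OpenCV.CASCADE_FRONTALFACE_ALT);  // load a face-detection cascade
}

void draw() {
  opencv.read();                // read the next frame
  image(opencv.image(), 0, 0);  // draw it to the screen
  Rectangle[] faces = opencv.detect();
  noFill();
  stroke(255, 0, 0);
  for (int i = 0; i < faces.length; i++) {
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
  }
}

Swap the cascade, or use frame differencing instead, and this starts to become the kind of motion tracking I’m after.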

What I need to start thinking about now is the way I want to test human interaction with my work. If I’m using motion tracking and video cameras, OpenCV in Processing or Max/MSP looks to be the way to go. However, if I’m to use a physical interface, or build some kind of sensor input into the work, a closer look at Arduino might be in order.

Either way, I’ve hit the book store once again and picked myself up a copy of Physical Computing by Dan O’Sullivan and Tom Igoe. The book looks not only at programming but also at building microcontroller-based interaction. Importantly though, it breaks things down to basic elements, which could help me get through what I think might be an Easter chocolate-related brain freeze.


