Professional Portfolio – Week 02
After presenting last week’s work in class, I received some valuable criticism on the direction of my work. It was levelled not at the function or technology of my project, but at the content I’m choosing to present. It was suggested that rather than focus on the exhibition inside CarriageWorks, I look at the history of the building and the area around it, from a long-term perspective.
This idea of ‘long-term’ was suggested through the lens of archaeology, or chronology, which started me thinking about how the audience might interact with my work. Removing the interface is something I was initially very keen on, but the thought of someone ‘digging’ through the history of an area definitely has its own appeal.
Content aside, the above image shows a mockup of how the digging metaphor might work. Using dirt or a similar material on top of a table, the audience can literally dig through a mini sandpit. This action would expose an infrared camera below to greater amounts of light, telling the computer how much (and possibly where) the audience has been digging. That change in light would result in projected bricks being removed and data sets taking their place.
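The detection side of that idea can be sketched very simply: compare each region of the camera image against a calibrated "covered" baseline, and treat any cell whose brightness has risen past a threshold as dug out. The grid, function names, and threshold below are all illustrative assumptions, not the actual patch.

```python
# Toy sketch of the digging detection: brightness values from an
# infrared camera frame (0.0 = covered by dirt, 1.0 = fully exposed)
# are compared per cell against a calibrated baseline. Cells whose
# brightness rises past a threshold count as "dug", and the projected
# brick at that grid position would be removed.

DIG_THRESHOLD = 0.4  # brightness increase that counts as digging (assumed)

def dug_cells(baseline, frame, threshold=DIG_THRESHOLD):
    """Return (row, col) positions where the sand has been cleared."""
    dug = []
    for r, (base_row, cur_row) in enumerate(zip(baseline, frame)):
        for c, (base, cur) in enumerate(zip(base_row, cur_row)):
            if cur - base > threshold:
                dug.append((r, c))
    return dug

# Toy 3x3 grid: the audience has dug out the centre cell.
baseline = [[0.1, 0.1, 0.1],
            [0.1, 0.1, 0.1],
            [0.1, 0.1, 0.1]]
frame    = [[0.1, 0.2, 0.1],
            [0.1, 0.9, 0.1],
            [0.1, 0.1, 0.1]]

print(dug_cells(baseline, frame))  # [(1, 1)]
```

In practice the same comparison would run over downsampled camera pixels rather than a hand-built grid, but the brightness-delta-against-baseline logic is the core of it.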
Finding long-term data sets has been less fruitful. There are some great data mapping repositories out there, but what I’m after is raw data to base these maps upon. Something I came across last year as part of the Australian Government 2.0 initiative was mashupaustralia. Although this competition has now closed (with some excellent winners), the data.australia site remains. Even if it’s a slightly cynical expression of Government openness, having public access to a broad cross-section of data is hugely helpful for any number of reasons.
However, this data all appears to be relatively recent. What I’m searching for is a more historical look at the area surrounding CarriageWorks. With such a rich Aboriginal and working class history in the area, it would be short-sighted not to include data that takes these demographics into account. Unfortunately, from what I’ve found, data becomes ‘hearsay’ once records get back to the early 1900s. A quick look at some of the earlier records on the Australian Bureau of Statistics website makes this pretty clear.
That said, I don’t need to settle on the data for this project before beginning the programming work. For most of this week, my MacBook was still in for repair, which made progress slow going.
Because Quartz Composer is strictly an Apple program, I worked primarily with OpenGL in Max/MSP again this week. Whilst it’s new to me (and therefore difficult at times), it may prove to be a blessing in disguise, as OpenGL appears to put far less strain on the CPU and the resulting image is smoother. Of course, working with something so far out of my own headspace presents its own set of problems.
The largest of these at the moment is the difficulty of shadow casting. Working with Quartz Composer in first semester made this a breeze, as it has its own built-in lighting module; unfortunately, Max/MSP does not. As the coloured bricks above show, whilst 3D shapes can be ‘lit’ in their own right, this lighting is not global: it does not cast shadows from one object onto another. Why this happens is totally outside the scope of my own work, and as others have posted on the Cycling 74 forums, it’s not an easy task to create this illusion within the OpenGL objects provided in Jitter.
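The reason local lighting can never produce shadows becomes obvious when you write the maths down: the standard diffuse (Lambert) term only involves the surface normal and the light direction, so no other object’s geometry enters the calculation at all. The toy functions below illustrate that, with a purely hypothetical `occluded` flag standing in for the extra shadow-ray or shadow-map test that a global technique would have to add.

```python
import math

# Why per-object lighting doesn't cast shadows: the Lambert diffuse
# term uses ONLY this surface's normal and the light direction, so
# another brick sitting between the surface and the light changes
# nothing. A shadow requires an explicit extra occlusion test.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def lambert(normal, light_dir):
    """Local diffuse intensity: no other object appears in this formula."""
    return max(0.0, dot(normalize(normal), normalize(light_dir)))

def shaded(normal, light_dir, occluded):
    """Global shading sketch: occlusion (from a shadow ray or shadow-map
    lookup, both hypothetical here) must be bolted on separately."""
    return 0.0 if occluded else lambert(normal, light_dir)

n = (0.0, 1.0, 0.0)  # surface facing straight up
l = (0.0, 1.0, 0.0)  # light directly above

print(lambert(n, l))               # 1.0 - fully lit, regardless of occluders
print(shaded(n, l, occluded=True)) # 0.0 - dark only once we test occlusion
```

This is essentially what the forum threads are describing: the fixed-function lighting Jitter exposes computes only the local term, and any shadow illusion has to be built by hand on top of it.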
Although I (finally) got my MacBook back from repair yesterday, I’m cautious about taking the easy route of jumping back into Quartz, at the expense of performance. This project is likely to work with far more data than I have previously, so there’s every chance the visuals will suffer as a result of a choking CPU/GPU. For the time being, I’ll continue working with OpenGL.
As you can see from the above video, OpenGL objects animate pretty smoothly. Of course, there’s nothing too complex going on there, and it should be noted that the grey bricks come from a Blender .obj file, rather than being created in OpenGL itself. I’ve also started using a process that I took from this excellent Jitter tutorial by Darwin Grosse, which allows me to enter OpenGL code in a text file, rather than in ever-growing message boxes in Max/MSP.
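The appeal of that text-file approach is that the drawing instructions become an ordinary script: one command per line, read and dispatched by the patch, instead of an unwieldy pile of message boxes. The sketch below mimics that pattern in Python; the command names (`reset`, `color`, `brick`) and the handler are invented stand-ins, not actual Jitter messages.

```python
# Sketch of the command-file pattern: drawing commands live in plain
# text, one per line, and a small loop parses and dispatches them.
# StringIO stands in for the external text file.
from io import StringIO

script = StringIO("""\
reset
color 0.6 0.6 0.6
brick 0 0 0
brick 1 0 0
""")

calls = []  # record of dispatched commands, standing in for GL calls

def handle(command, *args):
    """Hypothetical dispatcher: records the command and numeric args."""
    calls.append((command, tuple(float(a) for a in args)))

for line in script:
    parts = line.split()
    if parts:  # skip blank lines
        handle(parts[0], *parts[1:])

print(len(calls))  # 4 commands dispatched
print(calls[2])    # ('brick', (0.0, 0.0, 0.0))
```

The win is the same one the tutorial points out: editing a text file and re-reading it is far quicker to iterate on than rewiring message boxes inside the patch.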