Doming Along

lighting_20140522_003
Lighting rig test.

It’s a good time for me to do a quick technical update on the dome build, now that I have some (temporary) space to fully set up and test the structure. Finally, I’ve been able to lay out all 15 lighting heads (RGB PAR 36s) and connect them up with some custom XLR leads for DMX control. I mention this last point because I have a custom-lead fetish and it’s been too long since I had the opportunity to put some together. It’s weird, but that’s what happens after you’ve spent hard time in an audiovisual hire job.

Knowing that at some stage I would be controlling the lights from a pixel-based system, I plotted each light to fit within a grid as closely as possible. Each panel of the dome has three lights (except the doorway panel), with one at the top and two at the base. To begin with, at least, I imagine it will be helpful to treat the top lights differently (the top of the dome has a smaller diameter than the base), so I kept each light to an individual grid square…

lightingPlan_20140331
Lighting plan around dome.
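If you’re wondering what that grid idea looks like away from Max, here’s a rough Python sketch of the mapping (the grid positions and DMX channel numbers below are made up for illustration, not the actual dome layout): each head owns a grid square, and a frame of pixel colours gets translated into DMX channel values.

```python
# Hypothetical sketch: sample a small pixel grid and map each cell to an
# RGB PAR's DMX channels. Grid size, light positions and channel numbers
# are placeholders, not the real dome layout.

# (column, row) in the pixel grid -> first DMX channel of that light's R, G, B
LIGHT_MAP = {
    (0, 0): 1,   # top light, panel 1
    (0, 1): 4,   # base light, panel 1 (left)
    (1, 1): 7,   # base light, panel 1 (right)
    # ...and so on for the remaining heads
}

def frame_to_dmx(frame):
    """frame is a 2D list of (r, g, b) tuples, one per grid cell (0-255)."""
    dmx = {}
    for (col, row), start_channel in LIGHT_MAP.items():
        r, g, b = frame[row][col]
        dmx[start_channel] = r
        dmx[start_channel + 1] = g
        dmx[start_channel + 2] = b
    return dmx

# Example: top row dim blue (the smaller ring of top lights), base row warm red
frame = [
    [(0, 0, 80), (0, 0, 80)],
    [(200, 40, 0), (200, 40, 0)],
]
print(frame_to_dmx(frame))
```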

Following my previous tests with the dome material, I knew that hot spots from the lights were going to be a problem. These were even more pronounced on the inside of the dome, as the frosted material refracted the light in a strange way, creating the sensation that the lights were panning as you physically moved around inside the space.

To combat this, I cut covers for each of the lights from prismatic acrylic (that diamond-patterned stuff you see on most fluorescent office lighting), which acts as a diffuser. Although I lost some brightness from each light, the tradeoff was worth it: a much more even and broad spread from each lighting head…

lighting_20140522_001
Prismatic acrylic covers to diffuse light.
lighting_20140522_004
Red and yellow and pink and green…

DMX was originally handled directly from Max (here is a copy of the last patch I was working on, if you’re so inclined) via an ENTTEC DMX USB PRO, which controlled the lighting rig nicely. However, once I introduced Ableton running synced audio, my entry-level MacBook started to get really upset at me, and the fan tried to escape from beneath the keyboard.
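For the curious, the widget itself just wants each DMX frame wrapped in a small serial message. Here’s a minimal Python sketch of the ‘send DMX’ packet as I understand the ENTTEC API (label 6, start byte 0x7E, end byte 0xE7) – the port name is a placeholder, and in my actual setup Max was doing the talking, not Python.

```python
# Minimal sketch of the ENTTEC DMX USB PRO "send DMX packet" message
# (label 6), based on my reading of the widget's serial API. The port name
# is a placeholder; in the actual setup Max talked to the widget directly.
import serial  # pyserial

PORT = "/dev/tty.usbserial-XXXXXXXX"  # placeholder device name

def send_dmx(ser, channels):
    """channels: list of up to 512 values (0-255), channel 1 first."""
    data = bytes([0x00]) + bytes(channels)       # 0x00 = DMX start code
    length = len(data)
    packet = bytes([0x7E, 0x06,                  # start byte, label 6 = output DMX
                    length & 0xFF, length >> 8   # data length LSB, MSB
                    ]) + data + bytes([0xE7])    # end byte
    ser.write(packet)

if __name__ == "__main__":
    with serial.Serial(PORT, baudrate=57600) as ser:
        # Push the first head (channels 1-3) to full red, the rest off
        send_dmx(ser, [255, 0, 0] + [0] * 12)
```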

Against my better judgement, I decided to import my Max patch as MaxForLive devices, allowing me to run Ableton alone rather than alongside Max while trying to send MIDI back and forth (as a side note: why doesn’t Ableton have OSC support yet??). Instantly, and unsurprisingly, my MacBook hated me even more. It crashed at every opportunity and reached new and exciting levels of CPU usage; things did not go well with MaxForLive.

The following week was spent trying to figure out what was going wrong.

Mostly this involved me starting new Max patches from scratch. This is definitely a problem with my approach to coding – I keep telling myself that if I start from a clean slate, or make my code neater, surely it will run better. Suffice it to say, it was a week that doesn’t make for interesting blogging.

The end result is that automation envelopes from Ableton spit out messages faster than the DMX USB PRO can handle. The lighting rig wouldn’t keep up with what I was seeing on screen, messages would get weirdly queued, and even after I quit Ableton, the lights were still following their programmed cycle. Through trial and error, I discovered that one message every 50 milliseconds was the sweet spot for DMX, so a simple [speedlim] object in the Max patch sorted that out right away – instantly, my CPU usage dropped by 75–80%. Win.
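For anyone doing the same thing outside of Max, the throttling idea itself is simple. Here’s a rough Python equivalent of what [speedlim 50] is doing for me: keep only the newest frame and let one through at most every 50 milliseconds (a fuller version would also flush the last pending frame on a timer, so the final value isn’t dropped).

```python
# Rough equivalent of [speedlim 50]: keep only the most recent DMX frame and
# push it out at most once every 50 ms, instead of flooding the widget with
# every automation update.
import time

class SpeedLimit:
    def __init__(self, interval_s=0.05, send=print):
        self.interval_s = interval_s   # 50 ms between sends
        self.send = send               # e.g. a send_dmx() call like the one above
        self.pending = None
        self.last_sent = 0.0

    def update(self, frame):
        """Call as often as you like; only the latest frame survives."""
        self.pending = frame
        now = time.monotonic()
        if now - self.last_sent >= self.interval_s:
            self.send(self.pending)
            self.pending = None
            self.last_sent = now

# Example: hammer it with 1000 updates; only a fraction actually go out
limiter = SpeedLimit()
for i in range(1000):
    limiter.update([i % 256] * 45)   # 15 heads x 3 channels
    time.sleep(0.001)
```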

lightingTest_20140508_002
Max rig, testing DMX lighting.

And so now I’m finally (finally!) in a position to start making some content. Part of this study will be to create a kind of audiovisual storyline to give some direction to the experience. Rather than the space being completely interactive (or an entirely passive experience), there will be some level of narrative, which interaction can add to or redirect.

Already, I think the space is quite successful in facilitating imagination. I had a supervision meeting inside the dome yesterday, which felt very much like sitting around a campfire. The tone of the conversation was different to past meetings (though that could be my reflection upon it, rather than an objective observation), and there was an interesting movement between the surface being the focus of attention and simply being a background to the conversation.

The interface for interacting with the dome will be vital in helping the autistic children make a connection between their actions and the response of the space. I have a few LightBlue Beans in the post at the moment, which I’ll use to build wooden, block-like objects for controlling the space, but another option is always the trusty USB camera.

One of the joys of working with DIY tech is that Apple (and it is mostly Apple) likes to discontinue support for all manner of things every couple of years, so when you return to something that worked previously… it usually doesn’t anymore. This time around it was support for the PS3 EyeCam that I’d had much success with in the past. Following a helpful suggestion to use a Microsoft LifeCam instead, I got up and running relatively quickly and whipped the infrared filter out of that thing…

As nice as it is to watch a lighting hue cycle on a dome tent, the next stage is to reduce the sensory experience – to look at the bare minimum needed to create a suggestion of place, or a platform for imagination. It will be up to the children within the space to decide whether to interact (or not) and introduce further responses from the system and space.


