I had one of the greatest wins of my lifetime arrive this week, in the form of a successful HREC application. Now that this giant hurdle is out of the way, I finally get to focus on putting together the observational studies that form the centrepiece of my PhD – and only three-quarters of the way through my candidature.
Something that hasn’t had a lot of attention thus far is the interface for controlling the audio and lighting in the Interactive Multi-Sensory Environment (IMSE). (A note on that long-winded title: it’s how I refer to the structure in my academic writing, since “dome tent” sounds a little amateurish.) For some time, I had been planning on creating a set of toy-like wooden blocks as the physical interface – mostly for their familiarity as objects for children, and to avoid a potentially lengthy period of having to ‘teach’ each child how to use something bespoke.
Through their Kickstarter campaign in 2013, I’d purchased four LightBlue Beans: Bluetooth Low Energy microcontrollers based on the Arduino and – relevant for my purposes – featuring an on-board accelerometer. These little boards have loads of potential, but frustratingly they’re not entirely ready for primetime. Without going into too much detail, my biggest problem with the Bean right now is that it’s difficult to get data from the board into anything outside of the Arduino IDE or the LightBlue apps. It would take a few stages of hoop-jumping to actually use the data where I want it: inside Max.
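As a rough illustration of what I’d eventually do with that accelerometer data once it reached Max: which face of a block points up can be inferred from which axis gravity dominates. Here’s a minimal Python sketch of that mapping – the axis names and the 0.8 g threshold are my own assumptions for illustration, and the Bean itself would of course be running Arduino code:

```python
def face_up(x, y, z, threshold=0.8):
    """Given a 3-axis accelerometer reading in g, return which face of
    the block points up, or None if the block is tilted or in motion.
    Axis labels and the 0.8 g threshold are illustrative assumptions."""
    readings = {'+x': x, '-x': -x, '+y': y, '-y': -y, '+z': z, '-z': -z}
    face, value = max(readings.items(), key=lambda kv: kv[1])
    return face if value >= threshold else None
```

A block resting flat on the table reads roughly one g on its up axis, so `face_up(0.0, 0.0, 1.0)` gives `'+z'`; while a child is shaking or rolling the block, no axis dominates and the function returns `None`, which is a handy way to ignore transitional states.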
Nevertheless, development on the Bean seems to be happening quickly, so I’m remaining hopeful that this will get sorted out soon, and I’ve started prototyping the blocks which will house the Beans. First, things started out very simply:
Through testing, I found that the coin batteries run out fairly quickly. In real terms, the battery life of the Bean is actually good, but I’m lazy and don’t remove the battery while the boards aren’t ‘in use’ (in fact, being Bluetooth Low Energy devices, they’re always doing a brief connection check), so I’m constantly replacing them. One solution is to attach an AA battery pack, and so I whipped up an extremely simple 3D print to house both the Bean and the batteries.
Whilst I’m hopeful that the lack of support for platforms outside the Arduino IDE and the LightBlue apps will be fixed soon, I get burnt by new technologies regularly enough to know it’s best to have a fallback plan. For this project, I’m going to take another look at my old friend, the fiducial marker.
A little bit like a barcode, each fiducial has a unique ID. This makes them useful for tracking position and simple physical events, as can be seen in their best-known usage (at least in music-tech circles): the Reactable. By using a camera to track the fiducials, I can know the orientation of each box, much as the Bean’s accelerometer would tell me. This will let me do simple actions, like changing the lighting colour to reflect the chosen colour of the box.
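To make that concrete: if each face of a block carries its own fiducial, whichever IDs the camera reports tell me both which blocks are in play and which colour each one is showing. A toy Python sketch of that lookup – the IDs, block names, and colours here are all made up, and the real IDs would come from the tracking software at runtime:

```python
# Hypothetical fiducial-ID -> (block, colour) table; real IDs would be
# assigned by the fiducial set used by the tracking software.
FIDUCIAL_MAP = {
    0: ('block-a', 'red'),
    1: ('block-a', 'blue'),
    2: ('block-b', 'green'),
    3: ('block-b', 'yellow'),
}

def light_colour(visible_ids):
    """Given the set of fiducial IDs the camera currently sees,
    return the colour each block should drive the lighting with."""
    state = {}
    for fid in visible_ids:
        if fid in FIDUCIAL_MAP:
            block, colour = FIDUCIAL_MAP[fid]
            state[block] = colour
    return state
```

Unknown IDs are simply ignored, so stray detections from the camera don’t change the lighting state.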
The above video shows that the current colours aren’t all completely successful. The darker colours lack contrast against the dark fiducial, and the camera isn’t recognising them. I may try light fiducials instead, or only use light colours, to improve the contrast. Still, this is the basic interaction mechanic (to begin with): the colour of the block equals the colour of the light. The first stage of the study will simply be for the children to make a clear connection between their action and the response of the IMSE.
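Rather than guessing which block colours will track reliably, candidate colours can be pre-screened by computing a luminance contrast ratio between the black fiducial ink and the background colour. A quick sketch using the standard WCAG relative-luminance formula – the formula itself is well established, but treating any particular ratio as ‘enough’ for the tracker is my own guess; the real test is whether recognition actually locks on:

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an 8-bit sRGB colour."""
    def linearise(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(a, b):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    hi, lo = sorted((relative_luminance(a), relative_luminance(b)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

# Black fiducial ink on a pale yellow block scores near the maximum,
# while the same ink on a dark blue block barely registers any contrast.
```

For example, black on yellow `(255, 255, 0)` comes out around 19:1, while black on a dark navy `(0, 0, 128)` is only about 1.3:1 – which matches what the camera is telling me about the darker blocks.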