Recording Eye Contact

I had an informal meeting today with a neuroscientist who studies (amongst other things) brain development in people with autism. This is the second such meeting I’ve had so far, as I try to find out where my PhD research will sit in terms of science, and whether there are any academics from the sciences who might come on board as an additional supervisor and advise me on the structure of behavioural studies and the measurement of results.

It’s that latter point where the art and science sides of my methodology seem to be colliding. Whilst both meetings have been very helpful – not to mention the academics being generous with their time and knowledge – each of these scientists wanted to know what my ‘results’ or ‘outcomes’ would be. Coming from a fine arts campus, I find this isn’t quite as black and white as it might be if I were approaching the study as, say, a psychology student. There will be a mix of aesthetic, theoretical and scientific output from this study, and at this stage I have no idea what the hierarchy of all three (or more!) will be. Nevertheless, it got me thinking about what I can do with my own focus on technology to record or measure results. And so I went for a quick spin in MaxMSP to see if I could record some eye detection/tracking.

[Video: basicEyeTracking_20110123 – a quick test to record eye movement (left/right) in MaxMSP.]

Download Max patch.

One of the key indicators of social engagement is eye contact, something which many people with an Autistic Spectrum Disorder have difficulty with. So it would be useful to see whether eye contact is happening between two people as they interact with something. Of course I could use a video recording during any case studies (and probably will for recording many other behaviours), but I thought it could be interesting to see if I can use Max to record this data for me. Once again I returned to my favourite set of Max externals, cv.jit, and found that Jean-Marc Pelletier has actually made a dedicated (but unsupported) eye tracker for Jitter.
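For anyone reading this outside Max, a rough analogue of that per-frame eye detection step can be sketched in Python with OpenCV’s stock Haar cascades. To be clear, this is my own stand-in, not Pelletier’s eye tracker; the cascade file and the camera index are assumptions.

```python
import cv2

# OpenCV's stock Haar cascade for eyes -- an analogue to cv.jit's
# eye tracker, not the same algorithm (an assumption on my part).
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # default webcam; index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns one (x, y, w, h) box per detected eye in the frame.
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in eyes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("eyes", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```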

As with the cv.jit externals, the eye tracker works without any fuss. The above patch is a direct copy of the example patch that comes with the eye tracker download, except that it detects eyes in two separate parts of the camera capture area (this is different to tracking two sets of eyes in the same frame – I’ve simply split the capture area into two distinct regions) and tries to detect whether the eyes are looking right or left. The idea with this patch is that if one person is looking right and the other left, they may be looking at each other – or at least at the same thing, showing some level of engagement. My four years at art school came in handy for this test, as the eye tracking algorithms don’t seem fussed about real vs badly drawn images.
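That split-and-compare logic is simple enough to sketch outside Max, too. Below is a minimal Python/OpenCV version of the idea: split the frame down the middle, detect an eye in each half, and guess left/right gaze from where the dark pupil sits inside the detected eye box. The pupil-centroid test and the threshold value are my own crude assumptions – the actual patch may decide direction quite differently.

```python
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_direction(gray_half):
    """Return 'left', 'right', or None for one half of the frame.
    Direction is guessed from the pupil's horizontal position inside
    the first detected eye box -- a crude stand-in for whatever test
    the original patch performs."""
    eyes = eye_cascade.detectMultiScale(gray_half, 1.1, 5)
    if len(eyes) == 0:
        return None
    x, y, w, h = eyes[0]
    roi = gray_half[y:y + h, x:x + w]
    # Threshold the darkest pixels to approximate the pupil.
    _, pupil = cv2.threshold(roi, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    cx = m["m10"] / m["m00"]  # pupil centroid x within the eye box
    return "left" if cx < w / 2 else "right"

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mid = gray.shape[1] // 2
    person_a = gaze_direction(gray[:, :mid])   # left half of frame
    person_b = gaze_direction(gray[:, mid:])   # right half of frame
    # If the person on the left looks right and vice versa, they may
    # be looking at each other -- or at least at the same thing.
    if person_a == "right" and person_b == "left":
        print("possible mutual gaze")
    cv2.imshow("split view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```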

It’s a very basic patch and a very loose proof-of-concept idea, but one I just may pursue a little further.


