Awesome Update

The Awesome Foundation website. Awesomely straight to the point.

There’s been quite a lot going on during the uni break, so I thought an update might be in order, particularly so I could publicly thank the Sydney chapter of The Awesome Foundation for giving me a massive helping hand by way of a $1000 grant!

The Awesome Foundation began in Boston, but has quickly spread to cities all around the world. Briefly, each chapter consists of 10 people who meet monthly and give a $1000 grant to the application they see as the most awesomely deserving. I was fortunate enough to be chosen for May.

As a student, this is greatly appreciated and I have to say that I’m unimaginably grateful to the Sydney Awesome Foundation. Not only is the money a massive help, but hearing that your work is awesome is, indeed, very awesome. And certainly deserving of far more street cred than any other grant. I definitely encourage anyone working on an awesome project to apply (it’s suspiciously simple!) and also check out some of the other awesome ideas that have been supported by the group.

Something you may have spotted in the Awesome Foundation blog post about my idea is that it talks of creating devices for three different children. This is because my application was submitted in March, when I was still ambitious/naive enough to think I could get that done within the ever-shrinking uni year. During one of my presentations to academics late in the semester, they encouraged me to focus on one child, allowing me to create more iterations and show the development of a device, rather than a comparison of participant responses (which I’m not qualified to do anyway).

A further discussion resulted in some design changes over the past few weeks. To remove the confusion of one section of the device not being interactive – it just stores the ‘brain’ of the unit – I have now laid out the device as a row of three sections (instead of a square of four)…


3D design of Device 1

Very fortunately for me, I have a good friend who can fabricate acrylic objects (and who also helped me with my major project last year). I kept the design simple so we could get it turned out quickly, and made sure I included plenty of air vents around the sides and back (the boards shouldn’t generate too much heat, but I really can’t be sure how the device will be used once it’s in the home, and it’s best to play it safe). This image is a quick check of the layout within the device…


Checking component spacing in acrylic housing

I’ll quickly break down what is going into each section of the device, which will hopefully make things a bit clearer. Something that I’ve decided to do for the first iteration is to use a daisy-chained Arduino Uno board for each section. The Unos are all connected to a Freetronics EtherTen board, which will power the entire unit and send usage feedback to Pachube. This certainly isn’t the cheapest way to approach the device, but with different boards starting to cause me trouble with conflicting inputs/outputs, it became more important to find the path of least resistance. Getting feedback from the device being used is key now, and I can always go back to refining the process later.
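For the curious, Pachube’s API accepts simple CSV updates, one `datastream_id,value` line per datastream. Here’s a rough Python sketch of how the master board’s usage counts could be packaged up before sending – the datastream names and counts are made up for illustration, and on the actual device this would run as Arduino code on the EtherTen:

```python
# Build a Pachube-style CSV body from per-section usage counters.
# Datastream IDs and values here are hypothetical examples, not the real feed.
def pachube_csv(usage_counts):
    """usage_counts: dict mapping datastream id -> value.
    Returns one 'id,value' line per datastream, sorted for stable output."""
    return "\n".join(f"{stream},{value}"
                     for stream, value in sorted(usage_counts.items()))

body = pachube_csv({
    "audio_plays": 12,       # RFID-triggered tracks played
    "tactile_presses": 30,   # flex sensor activations
    "visual_triggers": 7,    # proximity sensor activations
})
print(body)
```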

First up is the audio section. Using the Adafruit Wave Shield and an RFID module, the interaction at this end of the device will require the child to place objects on top of it. With an RFID tag embedded in each of these objects, each one is unique and will trigger a different audio file…
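The tag-to-track logic is really just a lookup. A quick Python sketch of the idea – the tag IDs and filenames below are invented placeholders, and on the device itself this runs as an Arduino sketch reading the RFID module and driving the Wave Shield:

```python
# Map RFID tag IDs to audio files on the Wave Shield's SD card.
# Tag IDs and WAV filenames are hypothetical placeholders.
TAG_TO_TRACK = {
    "04A3F2": "DOG.WAV",
    "04B871": "TRAIN.WAV",
    "04C910": "MUSIC1.WAV",
}

def track_for_tag(tag_id):
    """Return the WAV file for a known tag, or None so the device
    simply stays quiet when an unrecognised tag is read."""
    return TAG_TO_TRACK.get(tag_id)
```

Returning `None` for unknown tags (rather than raising an error) means a stray tagged object just does nothing, which feels like the right failure mode in a child’s home.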


The centre section of the device is the tactile area. This part also houses the ‘brain’ of the unit, which will feed all the incoming data from each section back to a database, and inform further iterations of the device design. This section uses a DFRobot Motor Shield to control vibrating motors. The motors will be switched on/off by the amount of pressure placed on the flex sensors on top of the device…
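The on/off decision is a simple threshold on the flex sensor’s analog reading, ideally with a little hysteresis so the motors don’t chatter when the pressure hovers near the trigger point. A Python simulation of that logic – the threshold values are guesses, and a 10-bit Arduino analog reading runs 0–1023:

```python
# Hypothetical thresholds for a 10-bit (0-1023) flex sensor reading.
PRESS_ON = 600   # reading above this switches the motor on
PRESS_OFF = 500  # reading must drop below this to switch it off again

def motor_state(reading, currently_on):
    """Hysteresis: separate on/off thresholds stop the motor
    flickering when the reading hovers near a single cutoff."""
    if currently_on:
        return reading > PRESS_OFF
    return reading > PRESS_ON

# Simulate a press ramping up and easing off:
state = False
for reading in (100, 650, 550, 450):
    state = motor_state(reading, state)
```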


Finally, the visual section. This area has seen some last-minute changes, after having trouble powering the EL Escudo shield. I’m now instead using the EL Sequencer, which is more or less a standalone board. It will receive data from a proximity sensor via the Uno, which was another late addition (originally another RFID module) – an attempt to use a different form of interaction, allowing the participant as many options as possible. This section centres on electroluminescent wire (EL wire), driven by a battery and inverter, which will light up whenever anything is placed within range of the proximity sensor…
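The EL control is much the same idea as the tactile section: light the wire whenever something sits within range, and hold it on briefly after the object leaves so the wire doesn’t flicker as readings bounce around. A Python sketch with made-up numbers (the real version would live in the Uno sketch feeding the EL Sequencer):

```python
# Hypothetical values: proximity reading counted as "in range",
# and how long to keep the EL wire lit after the object leaves.
RANGE_THRESHOLD = 300
HOLD_MS = 500

def el_should_light(reading, now_ms, last_seen_ms):
    """Return (lit, updated_last_seen_ms).
    While in range, refresh last_seen_ms; once out of range,
    stay lit until HOLD_MS has elapsed since the last sighting."""
    if reading > RANGE_THRESHOLD:
        return True, now_ms
    return (now_ms - last_seen_ms) < HOLD_MS, last_seen_ms
```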


Keeping everything as modular as possible really has sped up the process. Although still quite buggy, each of the interactions is now working – including being daisy-chained off a central board. Over the next week or two, I’m hoping to refine the device and get it ready to be placed into the home. Exciting times ahead!
