Interface Design Machine 2.0

CLIENT: Unitec

PROJECT DATE: 2012

COLLABORATION: Noel Zeng & Peter Koraca

A core concern of my current project has been trying to understand people's natural gestures, rather than the learnt behaviours we currently employ with touchscreen-based devices. Some of these gestures have, of course, become natural through habit, but after processing the outputs of the first Interface Design Machine I realised that people have many different interpretations of how to act with an audio device.

IDM 2.0 is a web-based app that leverages the ubiquity of the iPad and other tablets to capture users' gestural responses to a series of common audio-player commands, such as 'Volume Up'. The app starts with a short instructional video and then cycles through a series of audio commands. The touch inputs are returned as CSV values:

User ID/Screen Width/Screen Height/Gesture/Milliseconds/Finger ID 1/X-axis/Y-axis/Finger ID 2/X-axis…
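To make that field order concrete, here is a minimal TypeScript sketch of how a browser app could record multi-touch input and serialise it into rows of that shape. Every name in it (GestureRow, toCsvRow, record) is a hypothetical illustration, not IDM 2.0's actual code.

```typescript
// Hypothetical sketch of a capture loop for an IDM-2.0-style app.

interface TouchSample {
  fingerId: number;
  x: number;
  y: number;
}

interface GestureRow {
  userId: string;
  screenWidth: number;
  screenHeight: number;
  gesture: string;      // the command being prompted, e.g. 'Volume Up'
  milliseconds: number; // elapsed time since the prompt appeared
  touches: TouchSample[];
}

// Serialise one row in the slash-separated order described above:
// User ID/Screen Width/Screen Height/Gesture/Milliseconds/Finger ID/X/Y/...
function toCsvRow(row: GestureRow): string {
  const head = [row.userId, row.screenWidth, row.screenHeight,
                row.gesture, row.milliseconds];
  const fingers = row.touches.flatMap(t => [t.fingerId, t.x, t.y]);
  return [...head, ...fingers].join('/');
}

// Record every touchmove while a command is on screen.
function record(el: HTMLElement, userId: string, gesture: string,
                started: number, out: string[]): void {
  el.addEventListener('touchmove', (ev: TouchEvent) => {
    const touches = Array.from(ev.touches).map(t => ({
      fingerId: t.identifier,
      x: t.clientX,
      y: t.clientY,
    }));
    out.push(toCsvRow({
      userId,
      screenWidth: window.innerWidth,
      screenHeight: window.innerHeight,
      gesture,
      milliseconds: Date.now() - started,
      touches,
    }));
  });
}
```

Because each finger contributes its own ID/X/Y triple, rows vary in length with the number of fingers down, which is why the schema trails off after Finger ID 2.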

I can then use these values to create a series of classifying Self-Organising Maps in MATLAB, which will in turn reveal the initial values for the gesture inputs in the final application, Biokinesis. Additionally, there is a Processing sketch, initially coded by Peter Koraca, which visualises the entire collated output for users to see on the linked webpage.
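For readers unfamiliar with the technique: a self-organising map is a grid of weight vectors trained so that nearby nodes come to respond to similar inputs, which is what makes it useful for clustering gestures. The actual maps are built in MATLAB; the TypeScript sketch below only illustrates the core algorithm (find the best-matching unit, then pull its neighbourhood toward the input), and its grid size, learning rate, and radius are illustrative assumptions, not the project's settings.

```typescript
// Minimal self-organising map, for illustration only.
type Vec = number[];

function dist2(a: Vec, b: Vec): number {
  return a.reduce((s, ai, i) => s + (ai - b[i]) ** 2, 0);
}

class Som {
  weights: Vec[][]; // rows x cols grid of weight vectors

  constructor(public rows: number, public cols: number, dim: number) {
    this.weights = Array.from({ length: rows }, () =>
      Array.from({ length: cols }, () =>
        Array.from({ length: dim }, () => Math.random())));
  }

  // Best-matching unit: the grid node whose weights are closest to x.
  bmu(x: Vec): [number, number] {
    let best: [number, number] = [0, 0];
    let bestD = Infinity;
    for (let r = 0; r < this.rows; r++)
      for (let c = 0; c < this.cols; c++) {
        const d = dist2(this.weights[r][c], x);
        if (d < bestD) { bestD = d; best = [r, c]; }
      }
    return best;
  }

  train(data: Vec[], epochs: number, rate0 = 0.5, radius0 = 3): void {
    const steps = epochs * data.length;
    let step = 0;
    for (let e = 0; e < epochs; e++)
      for (const x of data) {
        const t = step++ / steps;                 // 0 -> 1 over training
        const rate = rate0 * (1 - t);             // decaying learning rate
        const radius = Math.max(1, radius0 * (1 - t));
        const [br, bc] = this.bmu(x);
        for (let r = 0; r < this.rows; r++)
          for (let c = 0; c < this.cols; c++) {
            const g = (r - br) ** 2 + (c - bc) ** 2; // squared grid distance
            if (g > radius * radius) continue;
            const h = Math.exp(-g / (2 * radius * radius)); // neighbourhood
            const w = this.weights[r][c];
            for (let i = 0; i < w.length; i++)
              w[i] += rate * h * (x[i] - w[i]);   // pull node toward input
          }
      }
  }
}
```

In practice each input vector would be a fixed-length feature derived from one CSV row, for example touch coordinates resampled to a constant number of points, so that gestures of different durations can be compared on the same map.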

[Video] Processing 2 from Ben Jarrett on Vimeo.

IDM 2.0 is live, but it has only recently been uploaded, so I don't yet have enough data to process formally.

[Video] IDM2 Tablet Video 1200kbps from Ben Jarrett on Vimeo.

[Video] IDM 2 Talkthrough SD from Ben Jarrett on Vimeo.
