Here are some initial representations of a second IDM that expands on the path the first IDM helped create. I believe the first IDM was valid because it showed that we do not all respond to input commands in the same way, despite the ubiquitous gestures that map across most smart devices. IDM1 could not capture data quantitatively, so I was unable to expand the knowledge in a form we could codify and publish. This second IDM captures the user's touch data in an array (Time/Touch1/Touch2/Touch3/etc.). The data will be collated and made available, and the code will be open source. These two videos are variations on a theme, exploring how this webapp might play out. I am keen to make the transition from the introduction into the data capture feel seamless. The coding (HTML5/Canvas) will be done by the very able Noel Zeng.
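To make the array format concrete, here is a minimal sketch of how per-frame touch capture might look in a browser. This is purely illustrative and not the actual IDM2 code (which Noel Zeng will write): the function name `recordSample` and the row layout are assumptions based on the Time/Touch1/Touch2/Touch3 description above.

```javascript
// Illustrative sketch, not the actual IDM2 implementation.
// Each sample is one row: [time, x1, y1, x2, y2, ...],
// one (x, y) pair per active touch point on the canvas.
const samples = [];

function recordSample(timestamp, touches) {
  // Flatten the active touch points into a single row after the timestamp.
  const row = [timestamp];
  for (const t of touches) {
    row.push(t.clientX, t.clientY);
  }
  samples.push(row);
}

// In the browser this would be wired to the canvas, e.g.:
// canvas.addEventListener("touchmove", (e) =>
//   recordSample(e.timeStamp, e.touches));

// Simulated input (standing in for a TouchList) to show the row format:
recordSample(16.7, [
  { clientX: 120, clientY: 80 },
  { clientX: 300, clientY: 210 },
]);
// samples is now [[16.7, 120, 80, 300, 210]]
```

Rows accumulated this way serialize naturally to CSV or JSON, which would suit the goal of collating the data and publishing it alongside the open-source code.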
Interface Design Machine – Two