CLIENT: Unitec
PROJECT DATE: 2011 – 2013
COLLABORATION: Dr Nigel Yee
This is a component of my current Master's project. The project seeks to replace denotative, symbol-based communication with an abstract interface that uses movement to convey interface logic.
The neural network I am proposing acts as the structure on which the UI is based. With a neural network as the base structure of the application, we can create a ‘living’ interface that uses both audio and gestural inputs to modulate the grid-based output, both instantaneously and over the longer term. The application uses the neural network’s outputs as ‘base’ variables, with C (or a similar language) producing the final output.
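To make the idea concrete, here is a minimal sketch in C of how the network’s outputs might act as ‘base’ variables for a grid. Everything in it is an assumption for illustration, not the project’s actual implementation: the 2-input/4-output network with hand-set weights, the 8x8 grid, and the exponential moving average used for the longer-term modulation are all hypothetical.

/* Minimal sketch of the 'living interface' idea described above.
 * Assumptions (illustrative only): a 2-input, 4-output feedforward
 * network with hand-set weights, an 8x8 cell grid, and a slow
 * exponential moving average for the longer-term modulation. */
#include <stdio.h>
#include <math.h>

#define GRID_W 8
#define GRID_H 8
#define N_IN   2   /* audio level, gesture velocity */
#define N_OUT  4   /* 'base' variables driving the grid */

/* Hypothetical fixed weights; a trained network would supply these. */
static const double W[N_OUT][N_IN] = {
    { 0.9, 0.1 }, { 0.2, 0.8 }, { 0.5, 0.5 }, { -0.4, 0.7 }
};
static const double B[N_OUT] = { 0.0, 0.1, -0.1, 0.2 };

static double slow[N_OUT];  /* longer-term state (moving average) */

/* One forward pass: sensor inputs -> tanh units -> 'base' variables. */
static void forward(const double in[N_IN], double out[N_OUT]) {
    for (int i = 0; i < N_OUT; i++) {
        double s = B[i];
        for (int j = 0; j < N_IN; j++) s += W[i][j] * in[j];
        out[i] = tanh(s);
    }
}

/* Blend the instantaneous output with the slow state, then write the grid. */
static void update_grid(double grid[GRID_H][GRID_W], const double out[N_OUT]) {
    const double alpha = 0.02;  /* small alpha = slow long-term drift */
    for (int i = 0; i < N_OUT; i++)
        slow[i] += alpha * (out[i] - slow[i]);
    for (int y = 0; y < GRID_H; y++)
        for (int x = 0; x < GRID_W; x++) {
            /* Each cell mixes the base variables by position, so the
             * surface 'moves' as a whole rather than flipping symbols. */
            double u = (double)x / (GRID_W - 1);
            double v = (double)y / (GRID_H - 1);
            grid[y][x] = 0.5 * (out[0] * u + out[1] * v)
                       + 0.5 * (slow[2] * (1 - u) + slow[3] * (1 - v));
        }
}

int main(void) {
    double grid[GRID_H][GRID_W] = {{0}};
    /* Fake sensor frame: moderate audio level, quick gesture. */
    double inputs[N_IN] = { 0.6, 0.9 };
    double out[N_OUT];
    forward(inputs, out);
    update_grid(grid, out);
    printf("cell(0,0) = %.3f\n", grid[0][0]);
    return 0;
}

The split between the instantaneous output and the small-alpha moving average is one plausible way to realise the two timescales described above: gestures and audio push the grid immediately, while the interface’s ‘character’ drifts slowly over repeated use.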
It should be pointed out that I am currently at the lower end of coding ability, so this is a technical work in progress, but the basic concept of a ‘living interface’ is finalised. The reasoning for this form of interfacing is to create a form of emotional sustainability between the object and the individual, one no longer dictated by the desire for newer-better, but instead by a dialogue that is deeper-richer.
As this is part of my current Master's, it should be seen as an ongoing work in development.