The main idea is that people should not be restricted to the standard GUI (Graphical User Interface) when interacting with computer systems, or more accurately, when interacting with information. Instead, the authors suggested using physical objects in the real world, because we have developed rich languages and cultures that value haptic interaction with real physical objects.
"Our intention is to take advantage of natural physical affordances to achieve a heightened legibility and seamlessness of interaction between people and information." So how exactly would one go about doing this? The authors focused on graspable objects, which they called phicons (short for physical icons), and associated them with functions using metaphors. Three prototypes were presented as demonstrations: metaDESK, transBoard, and ambientRoom.
This design included a nearly horizontal back-projected graphical surface (the desk), an arm-mounted LCD screen, and a passive optically transparent "lens". Here a phicon could be a small model of the famous Great Dome building at MIT. Once it is placed on the desk, the display shows a 2D map with the location of the Great Dome right underneath the phicon. Then as the user moves or rotates the phicon, the map moves and rotates accordingly. A second phicon (a model of the Media Lab building) can also be placed on the map, and the two can then be used simultaneously to scale the map. The arm-mounted LCD can display a 3D model of the map and let the user traverse it as the arm is moved. The transparent "lens" works like a magnifying glass on the desktop display, revealing hidden information about each building.
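The two-phicon interaction can be understood as two-point registration: each phicon pins one map landmark to one spot on the desk, which fully determines the map's translation, rotation, and scale. The sketch below is my own illustration of that geometry, not code from the paper; the function name and coordinate conventions are assumptions.

```python
import math

def two_phicon_transform(p1, p2, m1, m2):
    """Given desk positions p1, p2 of two phicons and the map
    coordinates m1, m2 of the buildings they represent, return the
    (scale, rotation, translation) of the 2D similarity transform
    that keeps each building directly under its phicon."""
    # Vectors between the two anchor points in desk space and map space.
    dx_p, dy_p = p2[0] - p1[0], p2[1] - p1[1]
    dx_m, dy_m = m2[0] - m1[0], m2[1] - m1[1]
    # Scale: ratio of the phicon separation to the map-point separation.
    scale = math.hypot(dx_p, dy_p) / math.hypot(dx_m, dy_m)
    # Rotation: difference between the two vectors' angles.
    rotation = math.atan2(dy_p, dx_p) - math.atan2(dy_m, dx_m)
    # Translation: m1, after scale and rotation, must land exactly on p1.
    cos_r, sin_r = math.cos(rotation), math.sin(rotation)
    tx = p1[0] - scale * (cos_r * m1[0] - sin_r * m1[1])
    ty = p1[1] - scale * (sin_r * m1[0] + cos_r * m1[1])
    return scale, rotation, (tx, ty)
```

Moving either phicon just re-solves this transform each frame, which is why translation, rotation, and scaling all fall out of the same two-handed gesture.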
The video below is a demonstration of the metaDESK system.
The idea behind this system is that while we get information from what we are focusing on, such as the person we are having a dialog with, we also get information from ambient sources, such as passing traffic, lighting, and weather conditions. Therefore, if we could present information as ambient background and allow users to manipulate it with phicons, the user could stay focused on his main tasks, such as reading email, while still passively monitoring other information flows and being alerted to abnormal situations by the background sources. The example given in the paper displays web site traffic as ambient background. First, the authors tried using the sound of raindrops to represent web page hits. Eventually they settled on ripples on the surface of water, created by light projected through a water tank.
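As a rough sketch of how such an ambient mapping might work (my own illustration, with hypothetical names and constants, not the paper's implementation): traffic level is converted to a ripple rate that saturates, so heavy load reads as "busy" without ever becoming a distraction, and individual ripples are jittered so they feel like rain rather than a metronome.

```python
import random

def hits_to_ripple_rate(hits_per_minute, max_rate=10.0, saturation=600):
    """Map web-traffic level to a ripple rate (ripples/second).
    The mapping saturates so a flood of hits stays ambient rather
    than demanding attention; constants are illustrative."""
    level = min(hits_per_minute, saturation) / saturation
    return max_rate * level

def schedule_ripples(hits_per_minute, seconds=10, rng=random.Random(0)):
    """Return jittered timestamps (seconds) for ripple events over a
    window, simulating irregular 'raindrops' driven by page hits."""
    rate = hits_to_ripple_rate(hits_per_minute)
    count = int(rate * seconds)
    return sorted(rng.uniform(0, seconds) for _ in range(count))
```

The key design property is the saturating, low-resolution mapping: background displays should convey a trend at a glance, not exact numbers.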
The video below is a demonstration of this system.
This was implemented on a SoftBoard product that monitors the activity of tagged physical pens and erasers with a scanning infrared laser. "hyperCARDs" (barcode-tagged paper cards) serve as containers for digital strokes, which are broadcast live to remote users who may be monitoring the session with an identical hyperCARD. The hyperCARD can then be carried to home or office like an index card.
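Conceptually, the hyperCARD acts as a key into a shared stroke store: anyone holding a card with the same barcode ID sees the same live session. A toy model of that idea, with class and method names of my own invention rather than anything from the paper, might look like this:

```python
from collections import defaultdict

class StrokeSession:
    """Toy model of the hyperCARD idea: strokes drawn on the board are
    recorded under the card's barcode ID, so remote viewers holding a
    card with the same ID receive the same live stream."""

    def __init__(self):
        self._strokes = defaultdict(list)      # card_id -> recorded strokes
        self._subscribers = defaultdict(list)  # card_id -> live callbacks

    def subscribe(self, card_id, callback):
        """Register a remote viewer monitoring this card's session."""
        self._subscribers[card_id].append(callback)

    def add_stroke(self, card_id, stroke):
        """Record a stroke and broadcast it live to remote monitors."""
        self._strokes[card_id].append(stroke)
        for cb in self._subscribers[card_id]:
            cb(stroke)

    def replay(self, card_id):
        """Replay the session recorded under this card, e.g. at home."""
        return list(self._strokes[card_id])
```

The physical card doubles as both the live "channel" and the later replay handle, which is what makes it carryable like an index card.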
What makes this paper exciting to me is the idea of using real-world physical objects and their natural affordances as metaphors for interacting with information, and especially for interacting with a robot. In past research in our lab, a model airplane was used as a phicon to command a UAV (Unmanned Air Vehicle). The operator could simply hold and turn the model airplane, and the UAV would perform the same maneuver in mid-air. The metaphor is very intuitive and maps very well to the information we need to manipulate.
At the end of the paper, there is also a fun discussion about optical metaphors and how they can be coupled with digital information. Overall, it is an interesting read, but to get a better understanding of how the prototype systems work, the videos are better.
Liberate your mind. Inspiration can be found in the many everyday objects around you.