Designing the user experience in exhibition spaces - Elisa Rubegni, Caporali Maurizio, Antonio Rizzo, Erik Grönvall

From Media Design: Networked & Lens-Based wiki
Revision as of 13:05, 1 December 2021 by Martin (talk | contribs)

Notes

"The innovative part of this project does not concern the technologies used but the new user-experience modality enabled by technology for accessing and managing the contents. The proposed interaction model is based on the post-Wimp paradigm. The exhibition space is pro-active and affords the user's sensori-motor patterns in order to enhance the user's experience during the visit through the “magic wand”. Being a completely transparent tool, it allows visitors to enrich their experience of the exhibition. The magic wand acts as a mediator, enabling the user to stay focused on the exhibition instead of on the tool of interaction." (abstract)



HCI (Human-Computer Interaction) / direct manipulation

"The principles of direct manipulation (Shneiderman, 1983) are fundamental concepts within HCI (Human-Computer Interaction), guiding the development of graphical user interfaces towards Wimp (an acronym for windows, icons, menus and pointing device)."

WIMP GUI (graphical user interface) intention

"The Wimp GUI (Graphical User Interface) improves human-computer interaction by allowing simple and easy access to applications, and consequently it opens computing to a growing number of users."

WIMP interaction model limits

"The Wimp GUI, even though it supports the user's activity better than the command-line environment, does not close the semantic gap, on the execution side, between the intention and the execution of the action (Norman, 1990)."

"The Wimp model is based on the fact that computers have only a keyboard and mouse as input devices, which are limited in comparison with the user's set of actions. In this perspective, Wimp interfaces do not close the gap between the user's actions and the set of commands. Even though the Wimp model is a step forward with respect to the command-line interface, a semantic distance between the formulation of intention and the execution of the action still remains."

What is the WIMP interaction model

"According to Beaudouin-Lafon, the Wimp interaction model (Beaudouin-Lafon, 2000) can be defined as follows: application objects are displayed in document windows; objects can be selected and sometimes dragged and dropped between different windows; commands are invoked through menus or toolbars, often bringing up a dialog box that must be filled in before the command's effect on the object is visible."
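The indirection Beaudouin-Lafon describes — select an object, pick a command from a menu, complete a dialog, and only then see the effect — can be made concrete in a small sketch. This is purely illustrative; all class and method names here are invented for the example, not taken from the paper.

```python
# Toy model of the Wimp command cycle: selection -> menu -> dialog -> effect.
# Every name here is hypothetical, chosen only to illustrate the indirection.

class DocumentObject:
    """An application object displayed in a document window."""
    def __init__(self, text):
        self.text = text

class Dialog:
    """Stands in for the modal dialog that gathers command parameters."""
    def __init__(self, fields):
        self.fields = fields
        self.values = {}

    def fill_in(self, **values):
        missing = set(self.fields) - set(values)
        if missing:
            raise ValueError(f"dialog incomplete: {missing}")
        self.values = values

class Menu:
    """Commands are invoked by name; each brings up a dialog first."""
    def __init__(self):
        self.commands = {}

    def register(self, name, fields, effect):
        self.commands[name] = (fields, effect)

    def invoke(self, name):
        fields, effect = self.commands[name]
        return Dialog(fields), effect

# Usage: the command's effect reaches the selected object only after
# the dialog has been filled in -- the semantic distance in miniature.
menu = Menu()
menu.register("Replace", ["old", "new"],
              lambda obj, v: setattr(obj, "text",
                                     obj.text.replace(v["old"], v["new"])))

selected = DocumentObject("hello wimp")
dialog, effect = menu.invoke("Replace")
dialog.fill_in(old="wimp", new="world")
effect(selected, dialog.values)
print(selected.text)  # hello world
```

Note how many steps separate the user's intention ("change this word") from the visible result; direct-manipulation and post-Wimp designs try to collapse exactly this chain.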

Post-Wimp models

"The post-Wimp interaction model aims to reduce this gap by allowing a more direct interaction between the user and the manipulated object."

About Memex

"Memex consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture from where a user works. On the top are slanting translucent screens, on which material can be projected for convenient reading. The Memex vision strongly inspired the concepts of Desktop, Hyperlink and the research activity of the Xerox PARC Lab and the Ted Nelson project Xanadu."

Memex is the name of the hypothetical electromechanical device that Vannevar Bush described in his 1945 article "As We May Think".

Other examples

"Currently, the most common examples of post-Wimp interaction commercially available are pen-based handwriting recognition systems (such as Graffiti and Jot) used in hand-held PDAs and Tablet PCs (such as Penabled). In these cases the user interacts with the Wimp interface through non-Wimp objects (touch-screen technology and the pen)."

Graspable interfaces (Fitzmaurice et al., 1995)

"In this kind of interface, physical objects are used as input devices to manipulate virtual objects. This enriches the user experience, since interaction occurs in the real world. Arcade games are an example of graspable interfaces: the user interacts with the virtual reality through objects such as joysticks, pads, steering wheels and gear consoles. For example, in golf arcade games the player swings a real club and hits a real ball. The ball trajectory is then simulated and displayed on the screen."
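The golf example hinges on one computational step: the machine measures the real swing and then simulates the ball's flight on screen. A minimal sketch of that simulation, assuming the machine has already sensed a launch speed and angle, and ignoring air drag (the function name and parameters are illustrative, not from the paper):

```python
import math

def trajectory(speed_mps, launch_angle_deg, g=9.81, dt=0.05):
    """Sample the (x, y) flight path of a ball launched at the given
    speed and angle, ignoring drag, until it returns to launch height."""
    angle = math.radians(launch_angle_deg)
    vx = speed_mps * math.cos(angle)   # horizontal velocity
    vy = speed_mps * math.sin(angle)   # vertical velocity
    points = [(0.0, 0.0)]
    t = dt
    while True:
        x = vx * t
        y = vy * t - 0.5 * g * t * t   # constant-gravity ballistics
        if y < 0:                      # ball has landed
            break
        points.append((x, y))
        t += dt
    return points

# A hypothetical swing sensed at 40 m/s with a 12-degree launch angle:
path = trajectory(40.0, 12.0)
print(f"carry distance of about {path[-1][0]:.1f} m")
```

A real arcade machine would also sense spin and club face angle and render the path frame by frame, but the principle is the same: a physical gesture is reduced to a few measured parameters that drive a virtual simulation.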