Posted by Jennifer Chaffee
In the real world, objects that look like knobs, buttons and handles are mechanisms that help us control devices. The shape of a handle or knob can help us anticipate how it will behave. A handle shaped like a lever encourages us to push down on it. A slider on an audio device suggests pushing it back and forth to control the output. These are learned behaviors that we apply to how we interact with the world around us. From the moment we turn on a mechanical device with a virtual interface, we move from the real world to the virtual world, and the transition should be effortless. The virtual world should be made for us to interact with in a way that is just as obvious as the real world.
Buttons and other controls in the virtual world of the user interface are more obvious when they look selectable. Designers can give the user visual hints about what a control will do or how it will behave. In the world of psychology and user interface design, this is referred to as the object’s perceived affordance: how the object is perceived and what can be done with it. Using the real world as our point of reference, a button can be perceived to be “pressed”, a toggle switch “toggled”, a slider “slid”, a handle “pulled”, etc. In Don Norman’s essay, Affordance, Conventions and Design (Part 2), he states that these are perceived affordances, since we are using a mouse cursor (or a finger, in the case of a touch screen) to interact with the screen controls. This concept of perceived affordance allows the visual design to be a bridge between functionality and usability in the user interface.
When designing control elements for the UI, the visual cues should be based on how the elements will behave. The visual elements work together to create a story and communicate a meaning. Obviously, the keys on an onscreen keyboard can look similar to a real-world keyboard. A menu bar can be more of a challenge. Does every command on a menu bar need to look like it can be pressed? Should each type of control look different in the menu bar if it invokes a different type of behavior? For example, one command could take you to an entirely new screen, another could open a dialog, and a third could open a set of tools in a floating palette. One might argue that trying to make each menu bar control look different would get too complex. How we apply perceived affordance has to be balanced by keeping the interface from becoming cluttered and confusing.
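One lightweight way to give each kind of menu command its own cue, without redesigning every control, is to attach a small visual convention to the behavior it invokes (the trailing ellipsis for commands that open a dialog is a long-standing one). The sketch below is only illustrative; the type and function names are hypothetical, not from any particular framework:

```typescript
// Hypothetical sketch: tag each menu command with the kind of behavior
// it invokes, then derive a consistent visual hint from that tag.
type CommandBehavior = "navigate" | "dialog" | "palette";

interface MenuCommand {
  label: string;
  behavior: CommandBehavior;
}

// Map each behavior to a trailing glyph so the label itself hints
// at what will happen when the command is chosen.
function visualHint(cmd: MenuCommand): string {
  switch (cmd.behavior) {
    case "navigate":
      return cmd.label + " >";  // suggests moving to a new screen
    case "dialog":
      return cmd.label + "…";   // ellipsis: more input will be requested
    case "palette":
      return cmd.label + " ▸";  // opens a floating set of tools
  }
}

const commands: MenuCommand[] = [
  { label: "Print", behavior: "dialog" },
  { label: "Settings", behavior: "navigate" },
];

console.log(commands.map(visualHint).join(", "));
```

Because the hint is derived from a single mapping, the convention stays consistent across the whole menu bar, which addresses the complexity concern above: users learn three small cues rather than many bespoke control styles.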
The design of both the real and virtual worlds is affected by trends. There is a minimalist trend in design now to make things look thin, flat and streamlined. But this can be taken too far and cause frustration for the user. Affordance in the user interface should not become a concept that is ignored for the sake of looking cool. In the real world, I have a tower CPU with an on/off button that sits near the bottom of the front panel, flush with the case’s surface, and cannot be found easily by feeling for it, especially when I reach down under my desk to turn it on. I have to bend down and search for it. Compare that to another tower CPU I have that has the button on top of the tower. That button is a wide, illuminated colored plastic button, slightly raised above the surface, that is clearly lit when it’s on and easy to find even when it’s off. I can find it with my eyes closed.
The minimal-affordance button is a good example of where minimalist design has been taken to an extreme for the sake of style without, in my opinion, enough consideration of how a person will interact with the device. Controls in both the real world and the virtual world don’t have to hit us over the head to be obvious, but they do need to be considered from the user’s point of view, not just the designer’s.