Wednesday, January 10, 2007

Natural Human-Computer Interaction

With the development of information technology, we can expect computer systems to be increasingly embedded into our environments, such as homes and offices. These environments will create a need for new types of human-computer interaction beyond the keyboard, the mouse, and the remote control.
They will call for interfaces that are natural and easy to use, allowing people to interact with them the way they do with other people. In particular, they call for interfaces that make it possible to interact with computerized equipment without the need for special external devices.

These interfaces are not based on menus, mice, and keyboards but instead use gesture, speech, affect, context, and movement.
Their applications are not word processors and spreadsheets, but smart homes and personal assistants: “instead of making computer-interfaces for people, it is of more fundamental value to make people-interfaces for computers”.

The most important factor in making these applications possible in recent years has been the newfound viability of real-time computer vision and speech understanding.

Systems coupled with natural interfaces will enable tasks historically outside the normal range of human-computer interaction by connecting computers to phenomena, such as someone walking into a room, that have traditionally been beyond the scope of user interfaces. With natural interfaces the user experiences a form of context awareness, exploiting dialog modalities and behaviors that are commonly used in the ordinary activities of everyday life.
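As a toy illustration of reacting to such a phenomenon (a hypothetical sketch, not taken from any particular system), frame differencing is one simple way software can notice that someone has entered a room. A real system would read frames from a camera (e.g. via OpenCV); here frames are simplified to small grayscale grids:

```python
# Toy sketch: detecting "someone walked into the room" via frame differencing.
# Frames are simplified to small grayscale grids of pixel intensities; a real
# system would use a live camera feed, but the principle is the same.

def frame_diff(prev, curr):
    """Mean absolute pixel difference between two equally sized frames."""
    total = sum(abs(a - b)
                for row_p, row_c in zip(prev, curr)
                for a, b in zip(row_p, row_c))
    pixels = len(prev) * len(prev[0])
    return total / pixels

def detect_entry(frames, threshold=10.0):
    """Return the index of the first frame where motion exceeds the threshold."""
    for i in range(1, len(frames)):
        if frame_diff(frames[i - 1], frames[i]) > threshold:
            return i
    return None

# An empty room (uniform frames), then a person appears (bright blob).
empty = [[0] * 4 for _ in range(4)]
person = [[0, 0, 0, 0],
          [0, 200, 200, 0],
          [0, 200, 200, 0],
          [0, 0, 0, 0]]
print(detect_entry([empty, empty, person]))  # → 2
```

The threshold separates sensor noise from genuine change; production systems use more robust background-subtraction models, but the event-driven idea is the same.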

Three categories:

a) Virtual reality /augmented reality environments;
b) Perceptual interfaces (natural language interfaces based on speech understanding, or gesture interfaces based on computer vision);
c) Mixed solutions

PERCEPTUAL INTERFACES BASED ON COMPUTER VISION

The main advantage of using visual input in this context is that it makes it possible to communicate with computerized equipment at a distance, without the need for physical contact with the equipment to be controlled.

Compared to speech commands, hand gestures and body postures are advantageous in noisy environments, in situations where speech would be disturbing, and for communicating quantitative information and spatial relationships. The idea is that the user should be able to control equipment in his/her environment as-is, without the need for specialized external equipment such as a remote control.
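To make the gesture-command idea concrete, here is a minimal hypothetical sketch of mapping a hand posture to an equipment command by nearest-template matching. The feature vector (which fingers are extended) and the command names are invented for illustration; a real system would extract such features from camera images:

```python
# Toy sketch (hypothetical): mapping a hand posture to a command by
# nearest-template matching. Each posture is a vector of finger-extension
# values, thumb through little finger: 1.0 = extended, 0.0 = folded.

import math

TEMPLATES = {
    "volume_up":   [1.0, 1.0, 1.0, 1.0, 1.0],  # open hand
    "volume_down": [0.0, 0.0, 0.0, 0.0, 0.0],  # fist
    "select":      [0.0, 1.0, 0.0, 0.0, 0.0],  # pointing index finger
}

def classify(posture):
    """Return the command whose template is closest (Euclidean) to the posture."""
    return min(TEMPLATES, key=lambda t: math.dist(posture, TEMPLATES[t]))

# A slightly noisy pointing gesture still maps to "select".
print(classify([0.1, 0.9, 0.2, 0.0, 0.1]))  # → select
```

Template matching tolerates the noise inherent in vision-based feature extraction, which is one reason discrete posture vocabularies work well for remote control of equipment.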