Hands-on Computing

Hands-on computing is a branch of human–computer interaction research that focuses on computer interfaces which respond to human touch or expression, allowing the machine and the user to interact physically. By responding to motions and interactions that come naturally to people, hands-on computing can make complicated computer tasks feel more natural. It is therefore a component of user-centered design, focusing on how users physically respond to virtual environments.


Implementations

* Keyboards
* Stylus pens and tablets
* Touchscreens
* Human signaling


Keyboards

Keyboards and typewriters are among the earliest hands-on computing devices. They are effective because users receive kinesthetic, tactile, auditory, and visual feedback. The QWERTY keyboard layout is one of the first such designs, dating to 1878 (Baber, Christopher. ''Beyond the Desktop''. Academic Press, 1997). Newer designs such as the split keyboard increase the comfort of typing. Keyboards send instructions to the computer via keys; however, they do not allow the user direct interaction with the computer through touch or expression.


Stylus pens and tablets

Tablets are touch-sensitive surfaces that detect the pressure applied by a stylus pen. Magnetic tablets sense changes in a magnetic field, while resistive tablets register contact between two resistive sheets pressed together. Tablets let users interact with a computer through a stylus, but they do not respond directly to a user's touch.
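The resistive principle above can be sketched in a few lines: each axis of the panel behaves as a voltage divider, so the sampled voltage is roughly proportional to the touch position. The ADC resolution, panel size, and function names here are illustrative assumptions, not a description of any specific device.

```python
# Minimal sketch of how a 4-wire resistive touch panel reports position.
# Each axis acts as a voltage divider: driving one resistive sheet and
# sampling the other yields a voltage proportional to the touch location.
# The 12-bit ADC and panel dimensions are illustrative assumptions.

ADC_MAX = 4095  # full-scale reading of an assumed 12-bit converter

def touch_position(adc_x, adc_y, width, height):
    """Convert raw ADC samples into panel coordinates in pixels."""
    x = adc_x / ADC_MAX * width
    y = adc_y / ADC_MAX * height
    return x, y

# A reading near half the ADC range maps to roughly the centre of the panel.
print(touch_position(2048, 2048, 800, 480))
```

In practice a driver would also debounce readings and discard samples taken while no pressure is applied, but the position calculation reduces to this ratio.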


Touchscreens

Touchscreens allow users to interact with computers directly by touching the screen with a finger. Pointing at objects to indicate a preference or a selection is natural for humans, and touchscreens turn this natural action into computer input. Problems may arise from inaccuracy: a user attempts to make a selection, but because of incorrect calibration the computer does not register the touch at the intended location.


Human signaling

New developments in hands-on computing have led to interfaces that respond to gestures and facial signaling. Often a haptic device such as a glove must be worn to translate a gesture into a recognizable command. The natural actions of pointing, grabbing, and tapping are common ways to interact with the interface. Recent studies include using eye tracking to indicate a selection or to control a cursor; blinking and the direction of the gaze are used to communicate choices. Computers can also respond to speech input: developments in this technology allow users to dictate phrases to the computer instead of typing them to display text on an interface. Using human signals as input lets more people interact with computers in a natural way.
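One common way eye tracking signals a selection is dwell time: if the gaze rests inside a target region long enough, the system treats it as a click. The sketch below is a minimal illustration of that idea; the dwell threshold, sample format, and geometry are assumed for the example, not drawn from any particular eye-tracking system.

```python
# Hedged sketch of dwell-based gaze selection: a target is "clicked" when
# gaze samples stay inside its rectangle for a minimum continuous duration.
# The 0.8 s threshold and the sample format are illustrative assumptions.

DWELL_TIME = 0.8  # seconds the gaze must rest on a target to select it

def dwell_select(samples, target, dwell_time=DWELL_TIME):
    """samples: list of (timestamp, x, y) gaze points, in time order.
    target: (x0, y0, x1, y1) rectangle. Returns True once the gaze has
    remained inside the target for at least dwell_time seconds."""
    x0, y0, x1, y1 = target
    start = None
    for t, x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if start is None:
                start = t                 # gaze entered the target
            elif t - start >= dwell_time:
                return True               # dwelt long enough: select
        else:
            start = None                  # gaze left: reset the timer
    return False

gaze = [(0.0, 100, 100), (0.3, 105, 98), (0.9, 102, 101)]
print(dwell_select(gaze, (80, 80, 120, 120)))
```

Real systems add smoothing and visual feedback (such as a shrinking ring) so users can abort a dwell, but the core selection logic is this timer.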


Current problems

Many problems with hands-on computing interfaces are still being addressed through continuing research and development. The main challenge is designing a simple, user-friendly interface that can be produced inexpensively and at scale. Because some interactions between human and machine are ambiguous, the machine's response is not always what the user intended: different hand gestures and facial expressions can lead the computer to interpret one command while the user meant another entirely. Resolving this ambiguity is currently one of the main focuses of research and development. Researchers are also working out how best to design hands-on computing devices so that consumers can use them easily. Focusing on user-centered design while creating hands-on computing products helps developers build products that are effective and easy to use.


Research and development

This new field has much room for contributions in research and product development. Hands-on computing technologies require scientists and engineers to adopt a different problem-solving strategy: one that considers devices for interaction rather than just input, treats interaction devices as tools, and accounts for how interaction will mediate user performance and for the context in which the devices will be used. For a machine to be used successfully, people must be able to transfer some of their existing skills to operating it. This can be done directly, by relating the interface to a familiar concept, or by helping the user draw new inferences through feedback. Users must understand how to use and manipulate an interface in order to exploit its full capability; by applying their current skills, they can operate the machine without learning entirely new concepts and approaches (Waern, Y. "Human Learning of Human-Computer Interaction: An Introduction." ''Cognitive Ergonomics: Understanding, Learning and Designing Human-Computer Interaction'' (1990): 69–84).


References


"ThinSight"
Microsoft Research and Development. 19 November 2008. * "Office XP Speaks Out". Microsoft PressPass. 18 Apr. 2001. Microsoft. 5 December 2008. {{reflist Human–computer interaction