A Brief History of Tactile Pin Arrays
Force feedback devices that allow exploration of a virtual world through a pen-like stylus or a thimble currently give little provision for skin sensations, such as the complex distribution of forces on the skin that is perceived when placing a fingertip on a textured surface. Tactile pin array research seeks to replicate these sensations by using an array of individually controllable mechanical elements to perturb the skin at the user's fingertip.
The development of tactile display technology was motivated by sensory substitution, which is concerned with presenting a representation of information from one sense (for example, vision) using another modality (for example, audio or touch). The major application of sensory substitution technologies is in increasing accessibility for the sensory impaired. Tactile-vision substitution systems were the earliest to be developed, in order to present visual information to blind people. In a typical system, a camera captures visual information, which is converted to a tactile representation on a two-dimensional pin array. Some of the earliest work in this area was the development of the "Optacon", which converted printed letters to a vibrotactile representation using a miniature hand-held camera (summarised in ). Early pioneering work in tactile-vision substitution was also performed by Paul Bach-y-Rita and colleagues in the late 1960s. Their early systems displayed visual information captured by a tripod-mounted TV camera on a vibrotactile display on the user's back.
Researchers became interested in the idea of computer-mediated tactile sensations through the field of teleoperation. Tactile feedback can be used to relay contact information from the tip of a remote tool or the jaws of a slave manipulator to a human operator. Meaningful tactile feedback may be of great importance for fine control of force and position in grasping and manipulation tasks, and for controlling the level of force exerted at the tip of a tool on a slave device. Motivation for the cross-fertilisation of ideas between earlier sensory substitution work and teleoperation arose from the need to provide increased tactile sensitivity to astronauts during extravehicular activity on space missions. The thick, many-layered protective gloves worn by astronauts preclude tactile information during complex dextrous tasks such as satellite servicing or space vehicle maintenance. Pioneering work in this area was performed by Paul Bach-y-Rita and colleagues, who investigated relaying forces sensed at the fingers to electrotactile displays on the torso of astronauts.
The need to provide tactile cues became more pressing as more sophisticated master-slave systems were developed for applications such as remote surgery. Shimoga provides a review of tactile feedback technology that could potentially be applied in the design of teleoperator systems. The review reports the work of Patrick as some of the earliest to apply vibrotactile feedback to an exoskeleton hand master. In related early work on vibration display, Minsky and colleagues describe the Sandpaper system, which presented virtual textures via a two-degree-of-freedom force feedback joystick. Caldwell and Gosney provide some of the earliest reported work on displaying remotely sensed texture and slip to a human operator via a data glove enhanced with piezoelectric vibratory displays; their system was also capable of presenting thermal and pressure sensations. Perhaps the earliest exponents of teleoperation systems incorporating both force feedback and pin array displays were Robert Howe and colleagues in the mid-1990s.
Increases in computing power, the notion of "virtual reality", and the commercial availability of haptic force feedback devices as research tools have effectively shifted the main emphasis of tactile display research from representing remotely sensed real-world information to the challenges inherent in interacting with an environment consisting of simulated physical models on a computer. The drive to create realistic cutaneous stimulation for virtual environments could now be said to be the motivation behind most tactile display research. In particular, the need to combine tactile display with force feedback displays has led to increased efforts to miniaturise the technology. The growth in research activity in this area, and the advent of commercially available tactile pin arrays for researchers and end users, have also presented renewed opportunities to address the implications of this technology for accessibility and sensory substitution applications.
We are using the VTPlayer tactile mouse to investigate presenting distributed tactile feedback to the fingertips. The device (Figure 1) is similar to a standard computer mouse, except that the user rests their index and middle fingers on the two tactile displays located on top of the chassis. The displays each consist of a 4 by 4 matrix of pins that are each individually controllable through software. Pins can either be raised or lowered; under standard operation, the states of the pins are related to the colour of the pixels surrounding the mouse pointer - a dark pixel corresponding to a raised pin and a light pixel to a lowered pin (exact thresholds are unknown). Thus, a user can actively explore a desktop, menu, or other on-screen environment and receive tactile information regarding what is under the mouse pointer.
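As a rough sketch of the mapping just described (not the actual VTPlayer driver logic; the source notes that the exact thresholds are unknown, so the cutoff of 128 below is purely an assumption), a greyscale patch under the mouse pointer might be converted to pin states like this:

```python
def pins_from_pixels(pixels, threshold=128):
    """Map a 4x4 patch of greyscale pixel values (0-255) under the
    mouse pointer to pin states: True = raised (dark pixel),
    False = lowered (light pixel). The threshold of 128 is an
    assumption; the VTPlayer's actual thresholds are not documented."""
    return [[value < threshold for value in row] for row in pixels]

# Example: a patch that is dark on the left, light on the right
# produces raised pins on the left half of the display.
patch = [
    [30, 30, 200, 200],
    [30, 30, 200, 200],
    [30, 30, 200, 200],
    [30, 30, 200, 200],
]
print(pins_from_pixels(patch))
```

Each inner list corresponds to one row of the 4 by 4 pin matrix, so the tactile display tracks whatever region of the screen the pointer passes over.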
Figure 1. VTPlayer tactile mouse.
Exploring a complex environment, such as a desktop or other visually oriented graphical user interface, is extremely difficult from tactile information alone. The work on the project so far has focussed on the perceptual issues involved in resolving information presented on a tactile display, and its processing and comprehension by the user, so that appropriate stimuli can be designed for use in tactile applications for blind computer users. So far we have looked at perception of line gradient and texture patterns with the device. We are also interested in active and passive exploration, and the role of kinaesthetic feedback and control in the process of resolving information.
In conjunction with the Royal National College for the Blind in Hereford, UK, we have developed a system that integrates the VTPlayer into an application allowing browsing of a speech- and tactile-based representation of bar charts (Figure 2). Input is provided through a graphics tablet and stylus held in the dominant hand. Using the graphics tablet as an input device supports acquisition of contextual information through proprioceptive feedback ("where am I on the graph?"). The pins of the VTPlayer provide information to the non-dominant hand when the stylus is positioned over a bar. Thus, the user can use the tactile cues for guidance and navigation ("am I on a bar?") and also to indirectly obtain an estimate of the value of the bar ("how high is the bar?"). Direct access to graph labels and values is provided through speech audio ("what is this bar?"). We hypothesise that the tactile cues provided by the mouse support the user while navigating the bar chart, and will therefore aid efficient target acquisition. We will be investigating this in a forthcoming study with the system. Currently, blind people typically access digitally stored data through a screen reader and shortcut keys. Our system provides an approach analogous to "direct manipulation", by allowing a point-and-click style of interaction using proprioceptive, tactile and audio feedback. This promotes a less sequential and less time-consuming means of accessing the data. The system also promotes a shared representation with sighted colleagues, potentially allowing a blind pupil in a mainstream school to work with sighted classmates.
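One way the tactile cues described above could work is sketched below. This is a hypothetical illustration, not the project's actual implementation: the function name, the `(left, right, value)` bar representation, and the choice to raise pin rows in proportion to bar value are all assumptions made for the example.

```python
def pin_pattern_for_bar(stylus_x, stylus_y, bars, chart_height,
                        rows=4, cols=4):
    """Hypothetical sketch of the two tactile cues: if the stylus is
    inside a bar, raise a number of pin rows proportional to the bar's
    value ("how high is the bar?"); otherwise lower all pins
    ("am I on a bar?"). `bars` is a list of (left, right, value)
    tuples in chart coordinates; none of these names come from the
    actual system."""
    for left, right, value in bars:
        if left <= stylus_x <= right and 0 <= stylus_y <= value:
            raised = max(1, round(rows * value / chart_height))
            # Raise pins from the bottom row of the display upward.
            return [[r >= rows - raised for _ in range(cols)]
                    for r in range(rows)]
    return [[False] * cols for _ in range(rows)]

# Stylus over a half-height bar: the bottom two pin rows are raised.
print(pin_pattern_for_bar(5.0, 10.0, [(0.0, 10.0, 50.0)], 100.0))
```

Under this sketch the pin pattern alone answers both navigation questions: any raised pins mean the stylus is on a bar, and the number of raised rows gives a coarse estimate of its value, with speech audio supplying the exact label and number.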
Figure 2. The VTPlayer and graphics tablet systems. Supports access to data through proprioceptive, tactile and audio feedback.
© copyright 2004-2005 University of Glasgow. All rights reserved.
Last Modified: November 9, 2006