General gesture recognition



Beyond Fitts' Law: Models for Trajectory-Based HCI Tasks
Abstract: Trajectory-based interactions, such as navigating through nested-menus, drawing curves, and moving in 3D worlds, are becoming common tasks in modern computer interfaces. Users' performances in these tasks cannot be successfully modeled with Fitts' law as it has been applied to pointing tasks. Therefore we explore the possible existence of robust regularities in trajectory-based tasks. We used "steering through tunnels" as our experimental paradigm to represent such tasks, and found that...
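
The regularity this work establishes is the steering law: time to steer through a tunnel grows linearly with the integral of inverse path width along the tunnel. A minimal Python sketch of that prediction (the regression constants a and b below are hypothetical, not values from the paper):

```python
import numpy as np

def steering_id(segment_lengths, widths):
    """Approximate the steering-law index of difficulty,
    ID = integral of ds / W(s), for a tunnel sampled as segments."""
    return float(np.sum(np.asarray(segment_lengths) / np.asarray(widths)))

# Example: a 300 px straight tunnel that narrows from 40 px down to 10 px.
segments = np.full(30, 10.0)             # 30 segments of 10 px each
widths = np.linspace(40.0, 10.0, 30)     # tunnel width at each segment
ID = steering_id(segments, widths)

a, b = 0.05, 0.12                        # hypothetical fitted constants
predicted_time = a + b * ID              # steering law: T = a + b * ID
print(f"ID = {ID:.1f}, predicted steering time = {predicted_time:.2f} s")
```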

Analysing Mouse and Pen Flick Gestures
Abstract: Gesture based interfaces promise to increase the efficiency of user input, particularly in mobile computing where standard input devices such as the mouse and keyboard are impractical. This paper describes an investigation into the low-level physical properties of linear `flick' gestures that users create using mouse and pen input devices. The study was motivated by our need to determine sensible constraints on values such as the magnitude, timing, and angular accuracy of gestures for a...
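
The "low-level physical properties" studied here are quantities such as flick magnitude, duration, and angular accuracy. A sketch of how those properties might be extracted from timestamped samples (the sample data and intended direction below are illustrative, not the paper's apparatus):

```python
import numpy as np

def flick_properties(points, timestamps, intended_angle_deg):
    """Magnitude, duration, and angular error of a linear flick.

    points: (n, 2) cursor/pen samples; timestamps in seconds.
    intended_angle_deg: the direction the flick was supposed to take.
    """
    points = np.asarray(points, dtype=float)
    displacement = points[-1] - points[0]
    magnitude = float(np.hypot(*displacement))
    duration = float(timestamps[-1] - timestamps[0])
    actual_angle = np.degrees(np.arctan2(displacement[1], displacement[0]))
    angular_error = (actual_angle - intended_angle_deg + 180.0) % 360.0 - 180.0
    return magnitude, duration, angular_error

# Example: a short rightward flick sampled at roughly 100 Hz.
pts = [(0, 0), (15, 1), (42, 2), (80, 3)]
ts = [0.00, 0.01, 0.02, 0.03]
print(flick_properties(pts, ts, intended_angle_deg=0.0))
```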

Accuracy Measures for Evaluating Computer Pointing Devices
Abstract: In view of the difficulties in evaluating computer pointing devices across different tasks within dynamic and complex systems, new performance measures are needed. This paper proposes seven new accuracy measures to elicit (sometimes subtle) differences among devices in precision pointing tasks. The measures are target re-entry, task axis crossing, movement direction change, orthogonal direction change, movement variability, movement error, and movement offset. Unlike movement time, error rate,...
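
Three of the seven measures (movement error, movement offset, movement variability) are continuous functions of the cursor path's deviation from the straight task axis. A sketch following those definitions (the example path is made up):

```python
import numpy as np

def path_accuracy_measures(path, start, target):
    """Movement offset, error, and variability of a pointing path,
    measured against the straight task axis from start to target."""
    path, start, target = (np.asarray(a, dtype=float) for a in (path, start, target))
    axis = target - start
    axis /= np.linalg.norm(axis)
    normal = np.array([-axis[1], axis[0]])           # unit normal to the task axis
    y = (path - start) @ normal                      # signed deviation of each sample

    movement_offset = float(np.mean(y))              # mean signed deviation
    movement_error = float(np.mean(np.abs(y)))       # mean absolute deviation
    movement_variability = float(np.std(y, ddof=1))  # SD of deviations
    return movement_offset, movement_error, movement_variability

path = [(0, 0), (30, 4), (70, -2), (120, 3), (160, 0)]
print(path_accuracy_measures(path, start=(0, 0), target=(160, 0)))
```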

Effects of Lag and Frame Rate on Various Tracking Tasks
Abstract: Virtual environments involve the user in an interactive three-dimensional computer generated environment. The methods of interaction typically involve direct manipulation of virtual objects via three-dimensional trackers. The tracking signal may be degraded in various ways, impacting the ability of the user to perform various tasks. This presentation will address the impact of two types of degradation in the tracking signal, lag (transport delay) and low frame rate. These degradations are...

More than dotting the i's: Foundations for crossing-based interfaces
Abstract: Today's graphical interactive systems largely depend upon pointing actions, i.e. entering an object and selecting it. In this paper we explore whether an alternate paradigm, crossing boundaries, may substitute or complement pointing as another fundamental interaction method. We describe an experiment in which we systematically evaluate two target-pointing tasks and four goal-crossing tasks, which differ by the direction of the movement variability constraint (collinear vs. orthogonal) and by...

What You Feel Must Be What You See: Adding Tactile Feedback to the Trackpoint
Abstract: The present study makes two contributions to the literature on tactile feedback. First, it investigates the effect of tactile feedback in isometric rate control devices. The use of tactile feedback in this type of device has not been systematically investigated. An isometric joystick, such as the IBM TrackPoint™ in-keyboard pointing device, does not perceptibly move and is operated by force. Can tactile information delivered to the user's fingertip through such a device provide a feeling of...

A Survey of Gesture Recognition Techniques
Abstract: Processing speeds have increased dramatically, bitmapped displays allow graphics to be rendered and updated at increasing rates, and in general computers have advanced to the point where they can assist humans in complex tasks. Yet input technologies seem to cause the major bottleneck in performing these tasks: under-utilising the available resources, and restricting the expressiveness of application use. We use our hands constantly to interact with things: pick them up, move them, transform...

Adaptive Classification of Hand Movement
Abstract: Hand sign recognition, in general, may be divided into two stages: motion sensing, which extracts useful movement data from the signer's motion; and a classification process, which classifies the movement data as a sign. We have developed a prototype of the Hand Sign Classification (HSC) system that classifies a series of the full degrees-of-freedom kinematic data of a hand into sign language signs. It is built as a fuzzy expert system in which the sign knowledge can be represented by a high level...

Hand Tension as a Gesture Segmentation Cue
Abstract: Hand gesture segmentation is a difficult problem that must be overcome if gestural interfaces are to be practical. This paper sets out a recognition-led approach that focuses on the actual recognition techniques required for gestural interaction. Within this approach, a holistic view of the gesture input data stream is taken that considers what links the low-level and high-level features of gestural communication. Using this view, a theory is proposed that a state of high hand tension can be...

Providing Integrated Toolkit-Level Support for Ambiguity in Recognition-Based Interfaces
Abstract: Recognition technologies are being used extensively in both the commercial and research worlds. But recognizers are still error-prone, and this results in performance problems and brittle dialogues. These problems are a barrier to acceptance and usefulness of recognition systems. Better interfaces to recognition systems, which can help to reduce the burden of recognition errors, are difficult to build because of lack of knowledge about the ambiguity inherent in recognition. We have extended a user interface toolkit in order to model and to provide structured support for ambiguity at the input event level ...
See also the associated thesis, An Architecture and Interaction Techniques for Handling Ambiguity in Recognition-based Input, and the related paper Interaction Techniques for Ambiguity Resolution in Recognition-based Interfaces.
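
The excerpt does not show the toolkit's actual API; the sketch below only illustrates the underlying idea that an input event retains its full set of candidate interpretations so a mediator, rather than the recognizer, resolves the ambiguity (all names here are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Interpretation:
    text: str            # one candidate recognition result
    confidence: float    # recognizer's score for this candidate

@dataclass
class AmbiguousEvent:
    """An input event that keeps all candidate interpretations alive
    until some mediator resolves the ambiguity."""
    interpretations: list = field(default_factory=list)

    def resolve(self, min_confidence=0.8):
        """Auto-accept a clear winner, otherwise defer to the user."""
        best = max(self.interpretations, key=lambda i: i.confidence)
        if best.confidence >= min_confidence:
            return best
        return self.ask_user()

    def ask_user(self):
        # Placeholder mediator: a real interface would show an n-best
        # choice menu here instead of picking silently.
        return max(self.interpretations, key=lambda i: i.confidence)

event = AmbiguousEvent([Interpretation("hello", 0.55), Interpretation("hallo", 0.43)])
print(event.resolve().text)
```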


Dynamic System Representation, Generation, and Recognition of Basic Oscillatory Motion Gestures
Abstract: We present a system for generation and recognition of oscillatory gestures. Inspired by gestures used in two representative human-to-human control areas, we consider a set of oscillatory (circular) motions and refine from them a lexicon of 24 gestures. Each gesture is modeled as a dynamic system with added geometric constraints to allow for real-time gesture recognition using a small amount of processing time and memory. The gestures are used to control a pan-tilt camera neck. We propose extensions for use in areas such as mobile robot control and telerobotics.
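
One way to read "each gesture is modeled as a dynamic system with added geometric constraints" is to fit a simple oscillator to the observed motion and check its parameters against per-gesture constraints. The sketch below is that reading, not the authors' exact formulation:

```python
import numpy as np

def fit_oscillator(x, dt):
    """Least-squares fit of x'' = -omega^2 * x to a 1-D motion signal;
    returns the estimated oscillation frequency in Hz."""
    x = np.asarray(x, dtype=float)
    acc = np.gradient(np.gradient(x, dt), dt)   # numerical second derivative
    omega_sq = -float(x @ acc) / float(x @ x)   # least-squares omega^2
    return np.sqrt(max(omega_sq, 0.0)) / (2.0 * np.pi)

# Example: a noisy 2 Hz oscillatory hand motion, x-axis only.
dt = 0.02
t = np.arange(0.0, 2.0, dt)
signal = np.cos(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
print(f"estimated frequency: {fit_oscillator(signal, dt):.2f} Hz")
```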

Vision-Based Gesture Recognition: A Review
Abstract: The use of gesture as a natural interface serves as a motivating force for research in the modeling, analysis, and recognition of gestures. In particular, human-computer intelligent interaction needs vision-based gesture recognition, which involves many interdisciplinary studies. A survey on recent vision-based gesture recognition approaches is given in this paper. We shall review methods of static hand posture and temporal gesture recognition. Several application systems of gesture...

A Two-stage Scheme for Dynamic Hand Gesture Recognition
Abstract: In this paper a scheme is presented for recognizing hand gestures using the output of a hand tracker which tracks a rectangular window bounding the hand region. A hierarchical scheme for dynamic hand gesture recognition is proposed based on state representation of the dominant feature trajectories, using a priori knowledge of the way in which each gesture is performed.
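
A crude illustration of the two stages, assuming the dominant feature is the tracked window's centroid: quantize its motion into coarse direction states, then match the resulting state string against per-gesture templates (the templates here are invented):

```python
import numpy as np

DIRECTIONS = "ENWS"  # east, north, west, south

def direction_states(centroids, min_step=2.0):
    """Stage 1: quantize a tracked-window centroid trajectory into a
    string of coarse direction states."""
    states = []
    for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]):
        dx, dy = x1 - x0, y1 - y0
        if np.hypot(dx, dy) < min_step:
            continue                              # ignore jitter
        idx = int(np.round(np.arctan2(dy, dx) / (np.pi / 2))) % 4
        state = DIRECTIONS[idx]
        if not states or states[-1] != state:     # collapse repeats
            states.append(state)
    return "".join(states)

# Stage 2: match the state string against known gesture templates.
TEMPLATES = {"wave": "EWEW", "raise": "N", "circle": "ENWS"}

def classify(centroids):
    seq = direction_states(centroids)
    return next((name for name, t in TEMPLATES.items() if seq == t), "unknown")

print(classify([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]))
```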

Impact of Dynamic Model Learning on Classification of Human Motion
Abstract: The human figure exhibits complex and rich dynamic behavior that is both nonlinear and time-varying. However, most work on tracking and analysis of figure motion has employed either generic or highly specific hand-tailored dynamic models superficially coupled with hidden Markov models (HMMs) of motion regimes. Recently, an alternative class of learned dynamic models known as switching linear dynamic systems (SLDSs) has been cast in the framework of dynamic Bayesian networks (DBNs) and applied...

On-line 3D gesture recognition utilising dissimilarity measures
Abstract: In the field of Human-Computer Interaction (HCI), gesture recognition is becoming increasingly important as a mode of communication, in addition to the more common visual, aural and oral modes, and is of particular interest to designers of Augmentative and Alternative Communication (AAC) systems for people with disabilities. A complete microcomputer system is described, GesRec3D, which facilitates the data acquisition, segmentation, learning, and recognition of 3-Dimensional arm gestures. The...
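
The excerpt does not say which dissimilarity measures GesRec3D uses; a common baseline for 3-D arm trajectories is to resample each gesture to a fixed length and classify by the smallest mean point-to-point distance to stored class templates, as sketched here:

```python
import numpy as np

def resample(traj, n=32):
    """Resample a 3-D trajectory to n points, spaced evenly by arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(seg)))
    target = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(target, s, traj[:, d]) for d in range(3)])

def dissimilarity(a, b):
    """Mean Euclidean distance between corresponding resampled points."""
    return float(np.mean(np.linalg.norm(resample(a) - resample(b), axis=1)))

def classify(gesture, templates):
    """templates: dict mapping class name -> one recorded example trajectory."""
    return min(templates, key=lambda name: dissimilarity(gesture, templates[name]))
```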

Statistical Gesture Recognition through Modelling of Parameter Trajectories
Abstract: The recognition of human gestures is a challenging problem that can contribute to a natural man-machine interface. In this paper, we present a new technique for gesture recognition. Gestures are modelled as temporal trajectories of parameters. Local sub-sequences of these trajectories are extracted and used to define an orthogonal space using principal component analysis. In this space the probabilistic density function of the training trajectories is represented by a multidimensional...
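
The construction described, local sub-sequences of parameter trajectories projected into an orthogonal space found by principal component analysis, can be sketched with a sliding window and an SVD (the window size and component count below are arbitrary):

```python
import numpy as np

def subsequence_space(trajectories, window=10, n_components=5):
    """Build an orthogonal basis for local trajectory sub-sequences.

    trajectories: list of (T_i, d) arrays of gesture parameters over time.
    Returns the mean sub-sequence and the top principal directions.
    """
    windows = []
    for traj in trajectories:
        traj = np.asarray(traj, dtype=float)
        for t in range(len(traj) - window + 1):
            windows.append(traj[t:t + window].ravel())   # flatten (window, d)
    X = np.asarray(windows)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]                        # principal directions

def project(traj, mean, basis, window=10):
    """Project each sub-sequence of one trajectory into the PCA space."""
    traj = np.asarray(traj, dtype=float)
    subs = np.array([traj[t:t + window].ravel()
                     for t in range(len(traj) - window + 1)])
    return (subs - mean) @ basis.T
```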

A Proposal of Pattern Space Trajectory
Abstract: We propose a new appearance-based feature for real-time gesture recognition from motion images. The feature is the shape of the trajectory caused by human gestures in the "Pattern Space" defined by the inner product between patterns on frame images. It has three advantages: 1) it is invariant with respect to the target human's position, size, and lie; 2) it allows gesture recognition without interpreting frame image contents; and 3) there is no costly statistical calculation involved. In this...
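
One way to realise a "pattern space" defined by inner products between frame images: give each frame a coordinate per reference frame, equal to their normalised inner product, and treat the gesture as the trajectory of those coordinates. A sketch under that assumption:

```python
import numpy as np

def pattern_space_trajectory(frames, reference_frames):
    """Map each frame to a point whose coordinates are normalised inner
    products with the reference frames; the gesture is the resulting path."""
    refs = np.array([np.asarray(r, dtype=float).ravel() / np.linalg.norm(r)
                     for r in reference_frames])
    traj = []
    for frame in frames:
        v = np.asarray(frame, dtype=float).ravel()
        v /= np.linalg.norm(v)
        traj.append(refs @ v)            # one coordinate per reference frame
    return np.array(traj)                # shape: (n_frames, n_references)
```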

Wireless Static Hand Gesture Recognition with Accelerometers: The Acceleration Sensing Glove
Abstract: A glove with six 2-axis accelerometers on the fingertips and back of the hand is demonstrated using commercial off-the-shelf components. With an RF transmitter, the glove can act as a wireless input device to a computer. With gravity-induced acceleration offsets, we have developed a text editor where each hand gesture refers to a letter of the alphabet. Twenty-eight static hand gestures are recognizable with acquisition rates of up to 1 character/second. The glove is a prototype device for...
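
With six 2-axis accelerometers, each still posture gives a 12-value gravity signature, so static gestures can in principle be classified by nearest neighbour against per-letter calibration vectors. This sketch is illustrative only; the paper's actual recognizer and the template values below are not from the abstract:

```python
import numpy as np

def classify_posture(reading, templates):
    """Nearest-neighbour classification of a static hand posture.

    reading: 12 gravity-induced acceleration values (six 2-axis sensors).
    templates: dict mapping letters to averaged 12-value calibration readings.
    """
    reading = np.asarray(reading, dtype=float)
    return min(templates,
               key=lambda letter: np.linalg.norm(reading - templates[letter]))

# Hypothetical calibration data for two letters.
templates = {
    "a": np.array([0.9, 0.1] * 6),
    "b": np.array([0.1, 0.9] * 6),
}
print(classify_posture([0.85, 0.15] * 6, templates))
```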

Person-Independent Continuous Online Recognition of Gestures
Abstract: This paper presents a gesture recognition system based on Hidden Markov Models. It has several user-friendly capabilities such as person-independent and background-independent recognition. It can distinguish between up to 24 different gestures. An improved system is able to recognize gestures continuously and output the result with no noticeable delay.
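
The usual HMM recipe behind systems like this is one model per gesture class, with classification by highest log-likelihood. A sketch using the hmmlearn package as a stand-in (the paper's own implementation is of course not this library):

```python
import numpy as np
from hmmlearn import hmm   # assumed available; not the paper's system

def train_gesture_models(training_data, n_states=5):
    """training_data: dict gesture_name -> list of (T_i, d) feature sequences."""
    models = {}
    for name, sequences in training_data.items():
        X = np.concatenate(sequences)                 # stack all frames
        lengths = [len(seq) for seq in sequences]     # per-sequence lengths
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[name] = model
    return models

def classify(sequence, models):
    """Pick the gesture whose HMM assigns the observed sequence the
    highest log-likelihood."""
    sequence = np.asarray(sequence, dtype=float)
    return max(models, key=lambda name: models[name].score(sequence))
```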

The Sound of One Hand: A Wrist-mounted Bio-acoustic Fingertip Gesture Interface
Abstract: Two hundred and fifty years ago the Japanese Zen master Hakuin asked the question, “What is the Sound of the Single Hand?” This koan has long served as an aid to meditation but it also describes our new interaction technique. We discovered that gentle fingertip gestures such as tapping, rubbing, and flicking make quiet sounds that travel by bone conduction throughout the hand. A small wristband-mounted contact microphone can reliably and inexpensively sense these sounds. We harnessed this “sound in the hand” phenomenon to build a wristband-mounted bio-acoustic fingertip gesture interface. The bio-acoustic interface recognizes some common gestures that state-of-the-art glove and image-processing techniques capture, but in a smaller, mobile package.
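
The excerpt does not describe the recognizer itself; a minimal front end for such contact-microphone signals is short-time-energy onset detection, as sketched below (the sampling rate, frame size, and threshold are arbitrary):

```python
import numpy as np

def detect_taps(signal, rate, frame_ms=10, threshold=4.0):
    """Find tap-like onsets in a contact-microphone signal by flagging
    frames whose short-time energy exceeds `threshold` times the median."""
    frame = int(rate * frame_ms / 1000)
    n = len(signal) // frame
    energy = np.square(signal[:n * frame].reshape(n, frame)).mean(axis=1)
    active = energy > threshold * np.median(energy)
    # Report the start time (in seconds) of each run of active frames.
    onsets = np.flatnonzero(active & ~np.concatenate(([False], active[:-1])))
    return onsets * frame / rate

# Example: a synthetic 1 s signal with two brief bursts.
rate = 8000
sig = 0.01 * np.random.randn(rate)
sig[2000:2100] += 0.5 * np.random.randn(100)
sig[6000:6100] += 0.5 * np.random.randn(100)
print(detect_taps(sig, rate))
```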


