Dynamics and Interaction workshop

Tuesday, 2nd August to Thursday, 4th August 2005.

See the photos here!

Location: F121 (conference room), Glasgow University, Department of Computing Science, Dynamics & Interaction group.

Format: The workshop runs over three days, with talks on mobile, instrumented interface design in the mornings. In the afternoons, participants will split into teams for 'extreme programming' sessions, each led by a local Ph.D. student experienced in working with Pocket PCs and the MESH sensor pack, with the goal of building a novel instrumented interface. The talks, from a range of invited speakers, will include an introduction to the challenges of working with inertial data, the possibilities for non-visual display on mobile devices, and the use of dynamic models and probabilistic reasoning in the design of mobile interaction systems.

The MESH sensor pack is an advanced package of sensors (accelerometers, gyroscopes, GPS and a magnetometer) with vibrotactile feedback, developed by the Palpable Machines group at Media Lab Europe.

For videos of the interfaces we have built using these packs, see here. For examples of publications describing interfaces built around the pack, please see our publication list.

The workshop is part of Glasgow University's Computing Science Research Festival 2005, has been generously sponsored by the PASCAL network and the EPSRC, and forms part of the Audioclouds project.

We expect this workshop to be of interest to a range of groups, including interface designers and researchers in pattern recognition/machine learning, sonification/multimodal interaction, control theory, and human motor control.

Talks

Tuesday

9:00-9:30 Registration & Coffee.

9:30-9:45 Roderick Murray-Smith, Introduction and plan

9:45-10:15 Matt Jones, Contentment and Context Communication

Lots of people are working on ways of adapting mobile and ubicomp systems to accommodate the user's context. Most attempts fall short, though, as there are significant challenges in sensing, modelling and communicating contextual factors. In this talk we will explore some interaction design issues around context, and look at an attempt to present contextual information through the novel medium of music. The system we're building - onTrack - continuously adapts the music a user is listening to on their portable, personal player: we'll consider applications from pedestrian navigation to event alerting.
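
How onTrack maps context to music is not spelled out above; as a purely illustrative sketch (the function name and all constants below are our own inventions, not the onTrack implementation), one way a music player could steer a pedestrian is to pan and fade the music according to the error between the walker's heading and the bearing to the target:

    import math

    def music_cue(heading_deg, bearing_deg):
        """Return (left_gain, right_gain) in [0, 1]: the music pans toward
        the turn direction and fades as the heading error grows."""
        err = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0  # signed error
        fade = 0.5 + 0.5 * math.cos(math.radians(err))  # full volume on course
        pan = max(-1.0, min(1.0, err / 90.0))           # -1 hard left, +1 hard right
        left = fade * min(1.0, 1.0 - pan)               # target to the right: quieter left
        right = fade * min(1.0, 1.0 + pan)              # target to the left: quieter right
        return left, right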

10:15-10:30 Steven Strachan, gpsTunes - controlling navigation via audio feedback

10:30 Coffee

11:00 John Williamson, Control-based selection

11:40-12:30 Dynamics & Interaction Demos:
- Parisa Eslambolchilar
- Steven Strachan
- John Williamson, Hex
- Andrew Crossan, Rhythmic interaction
- Andrew Ramsay, Interaction design between fixed and mobile computers.

12:30-13:30 Lunch

13:30-14:10 Parisa Eslambolchilar, Making Sense of Fisheye Views
In this presentation we bring together ideas from manual control, gesture recognition, continuous multimodal interaction via a tilt input device, and language modelling in a focus-in-context (F+C) method. We show that human behaviour can be modelled within the F+C method, and demonstrate this model of fisheye views for browsing maps and reading text, controlled through tilt sensors via continuous movements. A touch-screen position-control mechanism also lets the user select and magnify a target directly, without tilt input.
A two-layer neural network trained with the backpropagation algorithm recognises stylus gestures, which are used to alter the lens parameters. A probabilistic language model infers user intentions and provides predictions for uncompleted words and phrases entered in the find edit box, reducing the effort of entering a whole word or phrase; it is also well suited to multimodal interaction.
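
The abstract names the gesture recogniser precisely: a two-layer network trained with backpropagation. The following is a minimal sketch of that family of classifier, not the authors' code; the feature size, class count and training data are invented for illustration, and the language model is not shown:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: each gesture is a fixed-length feature vector (e.g. resampled
    # stylus coordinates); labels are one-hot gesture classes.
    n_features, n_hidden, n_classes = 16, 8, 3
    X = rng.normal(size=(120, n_features))
    y = np.eye(n_classes)[rng.integers(0, n_classes, size=120)]

    # One hidden layer of sigmoid units, softmax output.
    W1 = rng.normal(scale=0.1, size=(n_features, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes)); b2 = np.zeros(n_classes)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for epoch in range(500):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        logits = h @ W2 + b2
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)

        # Backward pass: cross-entropy gradient through softmax is (p - y).
        d_logits = (p - y) / len(X)
        dW2, db2 = h.T @ d_logits, d_logits.sum(axis=0)
        d_h = (d_logits @ W2.T) * h * (1.0 - h)   # sigmoid derivative
        dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

        # Gradient-descent update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2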

14:10 - 17:00 Break into groups for team projects

Wednesday

9:00-10:00 Sile O'Modhrain & Stephen Hughes, The Development of the MESH sensor platform

MESH is a prototyping platform for multimodal applications that combines inertial sensing, magnetic sensing, GPS and a high-quality vibrotactile display into a backpack for a standard iPAQ. The motivation for developing MESH was to create a device that could sense where it was, which way it was pointing and how it was being moved. When combined with capabilities for visual, audio and vibrotactile feedback, the MESH system provides the functionality to explore new forms of interaction with mobile devices that tightly couple gestural input and multimodal display. In this talk we will present a brief overview of the development of the device and talk about some of the prototype applications we have developed using the MESH system.
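
As a hypothetical sketch of the kind of per-sample record such a pack delivers (the field names and units below are assumptions, not the real MESH API), together with the simplest quantity one can derive from it - tilt from the gravity component of the accelerometer:

    import math
    from dataclasses import dataclass

    @dataclass
    class MeshSample:
        accel: tuple      # (ax, ay, az) in m/s^2, device axes
        gyro: tuple       # (gx, gy, gz) in rad/s
        mag: tuple        # (mx, my, mz), normalised magnetometer reading
        lat: float        # GPS latitude, degrees
        lon: float        # GPS longitude, degrees

    def tilt_from_accel(s):
        """Pitch and roll (radians) from gravity; valid when the device is
        near-stationary, so the accelerometer mainly senses gravity."""
        ax, ay, az = s.accel
        pitch = math.atan2(-ax, math.hypot(ay, az))
        roll = math.atan2(ay, az)
        return pitch, roll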

10:00-10:30 Andrew Crossan, Rhythmic Interaction: Gait Phase Effects in Mobile Interaction

Rhythmic interaction methods have largely been ignored as an input mechanism. However, they have the potential to offer a natural method of interacting with a device for several types of task. This talk will introduce the area of rhythmic interaction for input and context sensing. It will particularly focus on the problem of conducting mobile usability evaluations, and demonstrate how rhythmic interaction techniques can be used together with a mobile device (instrumented with an accelerometer) to provide a quantitative understanding of the detailed interactions taking place when on the move, allowing us to develop better mobile interfaces.
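
Crossan's method is not detailed in the abstract; as a rough sketch of the underlying idea (the threshold and sample rate below are invented), one can estimate gait phase by detecting heel-strike peaks in the vertical accelerometer channel and mapping time-since-last-peak to a phase in [0, 1):

    import numpy as np

    def gait_phase(acc_z, fs=100.0, thresh=1.5):
        """acc_z: vertical acceleration in g, sampled at fs Hz.
        Returns a phase in [0, 1) per sample, resetting at each step."""
        phase = np.zeros_like(acc_z, dtype=float)
        last_peak, period = 0, int(fs)   # assume ~1 step/s until two peaks seen
        for i in range(1, len(acc_z) - 1):
            is_peak = (acc_z[i] > thresh and
                       acc_z[i] >= acc_z[i - 1] and acc_z[i] > acc_z[i + 1])
            if is_peak:
                period = (i - last_peak) or period
                last_peak = i
            phase[i] = min((i - last_peak) / period, 0.999)
        return phase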

10:30-11:00 Coffee

11:00-11:30 Steven Strachan, Sensor fusion for mobile devices

A brief introduction to sensor fusion: the motivation behind it, and the most common methods used to fuse data from various sensors in a meaningful way in a mobile context.
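
A standard example of such a method, sketched below under assumed parameters (a 100 Hz sample rate and a hand-picked mixing coefficient), is the complementary filter: integrate the gyroscope for responsive short-term tilt, and correct its drift with the accelerometer's gravity estimate:

    import math

    def fuse_tilt(samples, dt=0.01, alpha=0.98):
        """samples: iterable of (gyro_rate, accel_x, accel_z) tuples, with
        the gyro rate in rad/s. Returns the fused tilt angle in radians."""
        angle = 0.0
        for gyro_rate, ax, az in samples:
            gyro_angle = angle + gyro_rate * dt       # responsive, but drifts
            accel_angle = math.atan2(ax, az)          # noisy, but drift-free
            angle = alpha * gyro_angle + (1.0 - alpha) * accel_angle
        return angle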

11:30-12:30 Verena Hafner, Interpersonal Maps and the Body Correspondence Problem

In this talk, I will introduce a new method for studying sensorimotor coordination, a topic that has become increasingly relevant to Artificial Intelligence in recent years. I will introduce the concept of "interpersonal maps" that can be created using information theoretic measures. Unlike personal bodymaps, which refer only to one's own body, interpersonal maps also include a representation of another's body. Matching and discriminating between oneself and others results from the interplay of several developmental dynamics and involves skills such as imitation. Based on a set of experiments with Sony AIBO robots, I will show that this unified representation can help to elucidate both the formation of a body schema and the body correspondence problem. To conclude, I will give an overview of possible applications of this method to various research areas in Human Machine Interaction.
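
The abstract does not name the exact measure; a common information-theoretic choice for relating sensor channels (one's own against the other body's) is mutual information, which can be estimated from joint histograms as in this sketch (the bin count is an assumption):

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Histogram estimate of I(X;Y) in bits for two 1-D sensor streams."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)       # marginal of x
        py = pxy.sum(axis=0, keepdims=True)       # marginal of y
        nz = pxy > 0                              # skip zero cells: 0 log 0 = 0
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())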

12:30-13:30 Lunch

13:30-14:00 Allan Sinclair, Tennis Sensation: implementing new interaction techniques in product design
Human-computer interfaces in product design continue to rely on the visual mode of interaction, with the result that visually impaired users are excluded. Tennis Sensation is a virtual game that uses new interaction techniques and an inclusive design philosophy to allow its use by a wide range of users.

Thursday

9:30-10:30 Stephen Brewster, Alternative interactions for mobile situations

10:30 Coffee

11:00-11:30 John Williamson, Probabilistic Feedback

11:30-12:30 Roderick Murray-Smith, Dynamics & interaction

12:30-13:30 Lunch

Registration:

Industrial participants: £300
Academic participants: £50
Ph.D. students: free. (EPSRC funding means that Ph.D. students can have their accommodation costs covered - contact us for details.)
PASCAL network members: free.

Numbers will be strictly limited to 25 participants, in order to keep the workshop at a productive size and to keep the teams in the prototype-development part of the workshop to a maximum of five people each.

We are currently setting up registration. Payment will be by cheque (made out to the University of Glasgow), which can be handed over at the start of the workshop. If you would like to be kept informed, please contact Roderick Murray-Smith at rod@dcs.gla.ac.uk.