
Research projects within the Group

There is a lot of interesting research going on in the group (feel free to contact anyone in the group if you want to know more - contact details are on the home page). We are interested in multimodal human-computer interaction: using different sensory modalities to communicate information. Most current interfaces rely almost entirely on vision to present information. This is not natural: it can overload sighted users, and it is a major problem for people with sight impairments.

We are investigating the use of other senses for human-computer interaction, in particular hearing and touch. This page describes what we are doing and provides links to other resources. If you have any questions or comments then feel free to contact me.



Earcons and sonically-enhanced widgets

I have been working with earcons since 1990, with many interesting results, and have used them for many different purposes. Here is a chapter from the Human-Computer Interaction Handbook (Sears and Jacko) that I wrote on audio interfaces; it will give you some background and history on the topic. Earcons were first proposed by Meera Blattner in 1989. They are abstract, musical tones that can be used in structured combinations to create auditory messages. Blattner defines earcons as "non-verbal audio messages that are used in the computer/user interface to provide information to the user about some computer object, operation or interaction". Detailed investigations of earcons by me, Peter Wright and Alistair Edwards showed that they are an effective means of communicating information in sound. Give me more details
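
To give a flavour of how earcons are structured, here is a minimal Python sketch. It is not taken from any of our studies - the motives, mappings and names are invented for illustration - but it shows the core idea: short musical motives with fixed timbre, rhythm and pitch parameters, chained together to build a compound message.

    from dataclasses import dataclass

    @dataclass
    class Motive:
        timbre: str        # instrument family, e.g. "piano" or "brass"
        rhythm: tuple      # note durations in beats
        pitches: tuple     # one MIDI note number per duration

    def compound_earcon(*motives):
        """Play motives one after another to build a compound message."""
        return list(motives)

    # Hypothetical mapping: timbre identifies the object, rhythm the operation.
    file_motive = Motive("piano", (0.5, 0.5), (60, 64))               # "file"
    delete_motive = Motive("piano", (0.25, 0.25, 1.0), (64, 62, 55))  # "delete"

    for motive in compound_earcon(delete_motive, file_motive):        # "delete file"
        print(motive)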

What do earcons sound like?

The earcons in this and the following pages are all less than 100K. This means the quality is reduced but (hopefully) you should be able to download them quickly. Click on the icons below to play two example earcons.

[Sound icon: example earcon 1]  [Sound icon: example earcon 2]

Click here to find out more about the earcons you've just heard and to hear more. They were created as part of a detailed study to investigate the effectiveness of earcons. You can also download a HyperCard stack (400K) containing many example earcons. The earcons in this stack are described in more detail by clicking on the previous link. Here are some guidelines for designing and creating earcons.

My thesis

The title of my PhD thesis was:
Providing a structured method for integrating non-speech audio into human-computer interfaces.

It is probably the most detailed document on earcons, and it also has lots about auditory human-computer interfaces in general. Look at the abstract and introduction to get an overview; the complete thesis is available as a PDF file, or on my publication list. The chapter from the Human-Computer Interaction Handbook (Sears and Jacko) that I wrote on audio interfaces gives some background and history on the topic and is a bit more up-to-date than my thesis.

Sonically-enhanced widgets and a toolkit of resource sensitive widgets

We have done a lot of work looking at how non-speech sounds can be incorporated into standard graphical human-computer interfaces. This can improve performance and increase usability for both sighted and partially-sighted users. Our results have shown that sound can provide many benefits: from increased performance and user preference to reduced workload.

Jo Lumsden (an RA) and Murray Crease (a PhD student) are both working in this area; Ashley Walker used to work on this project. We have created a whole set of Web pages for this project. These have lots of useful information and a lot of Java demos of sonically-enhanced widgets to play around with. Our publications in this area are also available. This work is funded by the EPSRC.


Earcons and telephone-based interfaces

We have done several experiments to investigate the use of earcons for representing hierarchical information. This has applications in telephone-based interfaces (such as phone banking or voicemail), in the design of mobile phones, and for blind computer users. Telephone-based interfaces (TBIs) are becoming an increasingly important method for interacting with computer systems. The telephone is a ubiquitous device and is many people's primary route into the information infrastructure. Access to an increasing number of services is being offered over the telephone, such as voicemail, electronic banking and even Web pages. The rapidly increasing use of mobile telephones means that people access these services at many different times and places. Telephones themselves now also incorporate greater functionality (such as multi-party calling, address books, diaries or call forwarding). This extra functionality may be rendered useless if usability issues are not considered. These pages have all of the details of the project funded to look into earcons for telephone-based interfaces.
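
As a rough illustration of the hierarchical idea, the sketch below assigns a different sound parameter to each level of a menu tree, so a node's earcon inherits the sounds of its ancestors. The particular menus and parameter assignments are hypothetical, not the ones from our experiments.

    # Hypothetical assignment: level 1 of the menu sets the timbre, level 2
    # the rhythm, level 3 the pitch register, so each node's earcon inherits
    # its ancestors' sounds and only adds one new feature.
    TIMBRES = {"banking": "brass", "voicemail": "strings"}
    RHYTHMS = {"balance": (0.5, 0.5), "transfer": (0.25, 0.25, 0.5)}
    REGISTERS = {"confirm": "high", "cancel": "low"}

    def menu_earcon(path):
        """Build the earcon for a menu node, e.g. path = ("banking", "transfer")."""
        earcon = {}
        if len(path) > 0:
            earcon["timbre"] = TIMBRES[path[0]]
        if len(path) > 1:
            earcon["rhythm"] = RHYTHMS[path[1]]
        if len(path) > 2:
            earcon["register"] = REGISTERS[path[2]]
        return earcon

    print(menu_earcon(("banking",)))                        # timbre only
    print(menu_earcon(("banking", "transfer", "confirm")))  # fully specified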


Three-dimensional sound

We are now beginning to use 3D sound to expand the audio display space. This allows us to present sounds as coming from all around the user's head - in front, behind, above or below. This gives us much more space to represent audio feedback and can stop the audio space becoming as cluttered as the visual one (especially important on mobile computing devices). The approach we are taking is to use headphones (these improve the quality of the 3D sound) and a headtracker. Users can move their heads around and the sounds are recomputed to remain in the correct place in the world (a minimal sketch of this calculation follows the publication list below). This gives us high-quality 3D sound at low cost. Publications so far in this area:

  • Walker, A. and Brewster, S.A. (1999). Trading Space for Time in Interface Design. In Volume II of the Proceedings of INTERACT '99 (Edinburgh, UK), British Computer Society, pp. 67-68.
  • Walker, A. and Brewster, S.A. (2000). Spatial audio in small display screen devices. Personal Technologies, 4(2), pp. 144-154.
  • Brewster, S.A. and Walker, V.A. (2000). Non-Visual Interfaces for Wearable Computers. IEE Workshop on Wearable Computing (00/145), IEE Press.
  • Walker, A. and Brewster, S.A. (2001). "Sitting too close to the screen can be bad for your ears": A study of audio-visual location discrepancy detection under different visual projections. In Proceedings of ICAD 2001 (Helsinki, Finland), ICAD and Helsinki University of Technology, pp. 86-89.
  • Walker, A., Brewster, S.A., McGookin, D. and Ng, A. (2001). Diary in the sky: A spatial audio display for a mobile calendar. In Proceedings of BCS IHM-HCI 2001 (Lille, France), Springer, pp. 531-540.
  • Brewster, S.A., Lumsden, J., Bell, M., Hall, M. and Tasker, S. (2003). Multimodal 'Eyes-Free' Interaction Techniques for Wearable Devices. In Proceedings of ACM CHI 2003 (Fort Lauderdale, FL), ACM Press, Addison-Wesley, pp. 463-480.
  • McGookin, D.K. and Brewster, S.A. (2002). DOLPHIN: The Design and Initial Evaluation of Multimodal Focus and Context. In Proceedings of ICAD 2002 (Kyoto, Japan), ICAD, pp. 181-186.
  • McGookin, D. and Brewster, S.A. (2003). An Investigation into the Identification of Concurrently Presented Earcons. In Proceedings of ICAD 2003 (Boston, MA), ICAD.
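
To make the headtracking idea concrete, here is a small sketch of the core calculation: the sound's direction relative to the listener is the world direction of the source minus the current head yaw, so as the head turns the sound appears to stay fixed in the world. A real 3D audio engine would then render this angle (e.g. with HRTFs); the geometry and numbers here are purely illustrative.

    import math

    def head_relative_azimuth(source_xy, head_xy, head_yaw_deg):
        """Angle of the source relative to where the listener is facing.
        0 deg = straight ahead, positive = to the listener's right."""
        dx = source_xy[0] - head_xy[0]
        dy = source_xy[1] - head_xy[1]
        world_angle = math.degrees(math.atan2(dx, dy))   # 0 deg = +y axis
        return (world_angle - head_yaw_deg + 180.0) % 360.0 - 180.0

    # As the head turns right (yaw increases), a fixed source appears to
    # move left, so it stays put in the world - the headtracker effect.
    print(head_relative_azimuth((1.0, 1.0), (0.0, 0.0), 0.0))    # ~45, to the right
    print(head_relative_azimuth((1.0, 1.0), (0.0, 0.0), 45.0))   # ~0, straight ahead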

Our work on 3D audio is now focused on mobile devices and applications for blind people (see the sections below). David McGookin is looking at the design of earcons to be presented in 3D.


Interfaces to mobile computing devices and wearable computers

We are also working on using sound to enhance the interfaces of mobile and wearable computers. Mobile computing devices are becoming extremely popular: mobile telephones, Personal Digital Assistants (PDAs) and handheld computers are among the fastest-growing areas of computing. One problem with these devices is that screen space is very limited: the screen cannot physically be made bigger because the device must fit into the hand or pocket to be easily carried. As the screen is small it can become cluttered with information as designers try to cram on as much as possible. This has resulted in devices that are hard to use, with small text that is hard to read, cramped graphics and little contextual information.

One possible solution is to use sound to present information about widgets so that their size can be reduced. This would mean that the clutter on the display could be reduced and/or more information could be presented on the display. This must be done in a way that maintains usability, otherwise the smaller widgets will render the device unusable.
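
The sketch below illustrates the basic principle with a hypothetical sonically-enhanced button: sound confirms a successful press and signals a slip-off, so the visual target can shrink without the user losing track of what happened. The event names and the play_sound stand-in are invented for illustration, not our toolkit's API.

    def play_sound(name):
        # Stand-in for whatever audio API the widget toolkit provides.
        print(f"[sound: {name}]")

    class SonicButton:
        def __init__(self, x, y, w, h):
            self.x, self.y, self.w, self.h = x, y, w, h
            self.armed = False

        def contains(self, px, py):
            return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

        def mouse_down(self, px, py):
            if self.contains(px, py):
                self.armed = True
                play_sound("press")        # quiet confirmation

        def mouse_up(self, px, py):
            if self.armed and self.contains(px, py):
                play_sound("select")       # the action happened
            elif self.armed:
                play_sound("slip_off")     # attention-grabbing error sound
            self.armed = False

    b = SonicButton(0, 0, 10, 10)
    b.mouse_down(5, 5)
    b.mouse_up(12, 5)    # slipped off the button before releasing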

We have had three projects in this area. The first project (with Nokia) looked at how we might use non-speech sounds to aid navigation around complex non-visual menu structures, such as occur on a mobile phone. Full details of this project can be found on the Telephone project web pages.

The Toolkit project looked at the design of a toolkit of resource-sensitive widgets that could reconfigure themselves (and so the display). This allowed easy movement from a desktop to a mobile device, with information moved from the visual to the auditory modality to make up for the lack of screen space. In this project we also began to look at the use of 3D sound on mobiles. We have been able to reduce the size of widgets such as buttons by adding sound, while keeping usability levels high. This allows more information to be put on-screen or displays to be made less cluttered.
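
As a rough sketch of what "resource sensitive" means here, the hypothetical widget below chooses its presentation from the screen space it is given, moving feedback into audio as the space shrinks. The thresholds and presentations are invented for illustration, not taken from the toolkit itself.

    class ResourceSensitiveSlider:
        """Hypothetical widget that trades visual space for audio feedback."""
        def render(self, width_px):
            if width_px >= 200:
                return {"visual": "full slider with labels", "audio": None}
            elif width_px >= 80:
                return {"visual": "compact slider", "audio": "tick per step"}
            else:
                return {"visual": "icon only", "audio": "pitch tracks the value"}

    slider = ResourceSensitiveSlider()
    print(slider.render(300))  # desktop: plenty of space, silent widget
    print(slider.render(60))   # handheld: the information moves into audio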

The final project, the AudioClouds project, is looking at how 3D sound and gestures can be combined to make effective interfaces for mobile devices. Gestures are good for input as they do not require visual attention; 3D sound is good for display as it gives lots of display space. The project web site has all of the details.

We are involved in the MobileHCI series of workshops - we started them in Glasgow and are holding MobileHCI 2004 at Strathclyde University in Glasgow.


Haptic interaction

The Group has done much work in the area of haptic (touch) interaction. This provides another sense that can be used for multimodal interaction, in addition to hearing and sight. The technology to do this is still quite new; it was first developed so that users could feel objects in virtual environments. Minsky (in Blattner & Dannenberg, 1992) describes the technology thus: "Force display technology works by using mechanical actuators to apply forces to the user. By simulating the physics of the user's virtual world, we can compute these forces in real-time, and then send them to the actuators so that the user feels them". See our haptics pages for the details of the work we are doing in this area.
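
A standard way to compute such forces (sketched here as a general technique, not necessarily the method used in any particular project of ours) is penalty-based rendering: when the device tip penetrates a virtual surface, push back with a spring force proportional to the penetration depth. Real haptic loops run at around 1 kHz; the stiffness and positions below are purely illustrative.

    STIFFNESS = 500.0   # N/m, illustrative spring constant
    WALL_X = 0.0        # wall surface at x = 0; free space is x > 0

    def wall_force(tip_x):
        penetration = WALL_X - tip_x
        if penetration > 0:                  # tip is inside the wall
            return STIFFNESS * penetration   # force along +x, out of the wall
        return 0.0

    for x in (0.01, 0.0, -0.002, -0.01):
        print(f"tip at {x:+.3f} m -> force {wall_force(x):.1f} N")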

We are interested in gestural interaction for mobile and wearable devices, force-feedback for desktop interfaces and for blind users, and tactile interfaces. We are working on Tactons, or tactile icons, which are similar to earcons in that they can convey structured messages, but using vibration against the skin rather than sound.
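
As a rough sketch of the idea, a Tacton can be described as a structured vibration pattern whose parameters carry the message. The mapping below (rhythm for message type, intensity for urgency) is hypothetical, for illustration only, not a design from our studies.

    def tacton(rhythm, intensity):
        """rhythm: alternating (on_ms, off_ms, ...) durations; intensity: 0..1."""
        return [("vibrate", ms, intensity) if i % 2 == 0 else ("pause", ms, 0.0)
                for i, ms in enumerate(rhythm)]

    # Hypothetical mapping: rhythm encodes the message type, intensity its urgency.
    new_mail = tacton((100, 50, 100), intensity=0.4)              # two gentle pulses
    urgent_alarm = tacton((300, 100, 300, 100, 300), intensity=1.0)
    print(new_mail)
    print(urgent_alarm)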


Multimodal visualisation for blind people (Multivis)

The MultiVis project is investigating the presentation of visualisations to blind people. This project uses the work on earcons, 3D audio and haptics described above to try to present graphs, tables, 3D plots and other visualisations to blind people. The project Web pages have more details.

One of the main deprivations caused by blindness is the problem of access to information. Visualisation is an increasingly important way for people to understand complex information (using tables, graphs, 3D plots, etc.) and to navigate around structured information. Computer-based visualisation techniques, however, depend almost entirely on high-resolution graphics, and for visually-impaired users the problems of using complex visual displays are great. There are currently only limited methods for presenting information non-visually, and these do not provide the speed and ease of use of their graphical counterparts. This makes visualisation techniques effectively unusable for blind people, depriving them further. We are investigating this problem by using techniques from Virtual Reality (VR) that allow users to feel and hear their data.
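
One simple way to let users hear a graph - a common sonification technique, sketched here with invented parameter choices rather than as our actual system - is to sweep through the data from left to right and map each y value to pitch, so rises and falls become audible contours.

    def sonify(ys, low_note=48, high_note=84):
        """Map a data series onto MIDI note numbers, one note per point."""
        lo, hi = min(ys), max(ys)
        span = (hi - lo) or 1.0
        return [round(low_note + (y - lo) / span * (high_note - low_note))
                for y in ys]

    data = [0, 1, 4, 9, 16, 25]   # y = x**2
    print(sonify(data))            # a rising pitch contour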

The innovative aspect of this project has been to investigate the different sensory modalities to see how they can best be used for visualisation, and so to create a powerful, multimodal visualisation system that makes the most of the senses our users have. We will be using force-feedback, 3D sound, braille, and speech input and output to try to overcome the problems caused by the lack of vision. The second stage of this project will begin in October 2004 and will investigate navigation, two-handed interaction and external memory issues in visualisations for blind people.


User interface design for older adults (UTOPIA)

UTOPIA (Usable Technologies for Older People: Inclusive and Appropriate) is a Scottish research project investigating the design and development of computer-based technology for older people. It is formed from a partnership of research groups at four universities (Dundee, Glasgow, Abertay and Napier).

The proportion of older people in the population is increasing, and with it the demand for long-term care and help with their particular needs. Although many older people are independent and contribute much to the community, as we grow older we will, in general, experience a reduction in our abilities and usually require support in some activities, eventually even in the basic activities of life.

The Glasgow part of this project is looking at the interface design of mobile and handheld devices, and at mobile navigation aids, for older people. More details are on our UTOPIA web page.

We ran a workshop at BCS HCI 2002 on "A New Research Agenda for Older Adults"; the website has all of the papers that were presented. The proceedings of the workshop were published as a special issue of the journal Universal Access in the Information Society. We ran a second workshop at BCS HCI 2004 on HCI and the Older Population.

If you have any questions about the projects described on this page, or want more details, then feel free to email me.