
Multimodal Interaction Group


Principles for Improving Interaction in Telephone-Based Interfaces

EPSRC project GR/L66373

Oct/97 - Oct/2000

Project abstract

Telephone-based interfaces (TBIs) are an increasingly important method of interacting with computer systems (such as electronic banking and voicemail). Telephones themselves are also incorporating greater functionality (such as address books and call forwarding). In both cases this extra functionality may be rendered useless if usability is not considered. One common usability problem is users getting lost when navigating through hierarchies of options or functions. This may mean that some functions are not used, or that users cannot achieve the goals they wish.

The innovative aspect of this proposal is to use structured non-speech sounds (such as short pieces of music) to enhance the output of information in TBIs. Sound can present information rapidly without getting in the way of any speech output. I will investigate the use of sound to provide navigation cues to stop users getting lost, and also to provide richer output methods to create more flexible interaction techniques. To ensure effectiveness, I will perform full usability evaluations. TBI designers will benefit from this research because the guidelines produced will enable them to create more powerful interfaces. End users will benefit because the resulting telephones and telephone services will be more usable.

The full Case for Support submitted to the EPSRC

Project final report

Researchers

The principal investigator was Prof. Stephen Brewster. Gregory Leplatre was employed on the project as a research student, and Seppo Helle from Nokia was involved as an industrial collaborator.

Earcons and telephone-based interfaces

We have carried out several experiments to investigate the use of earcons for representing hierarchical information. This work has applications in telephone-based interfaces (such as phone banking or voicemail), the design of mobile phones, and interfaces for blind computer users. Telephone-based interfaces (TBIs) are becoming an increasingly important method for interacting with computer systems. The telephone is a ubiquitous device and is many people’s primary method of entry into the information infrastructure. Access to an increasing number of services is being offered over the telephone, such as voicemail, electronic banking and even Web pages. The rapidly increasing use of mobile telephones means that people access these services at many different times and places. Telephones themselves are now also incorporating greater functionality (such as multi-party calling, address books, diaries or call forwarding). The provision of this extra functionality may be rendered useless if usability issues are not considered.
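The core idea behind hierarchical earcons can be sketched in a few lines. This is an illustration only: the menu names and musical attributes below are hypothetical, not the project's actual sounds. Each node's earcon inherits its parent's motif and adds one distinguishing musical attribute, so the sound heard at a deep node encodes the whole path to it.

```python
# Sketch of hierarchical earcon construction (illustrative, not the
# project's implementation): a child earcon is its parent's motif
# plus one new distinguishing attribute.

def make_earcon(parent_motif, new_attribute):
    """Return a child earcon: parent motif plus one new attribute."""
    return parent_motif + [new_attribute]

# Hypothetical three-level phone-menu hierarchy.
root      = make_earcon([], ("rhythm", "two short notes"))
messages  = make_earcon(root, ("timbre", "brass"))
voicemail = make_earcon(messages, ("pitch", "rising"))

# The voicemail earcon carries a cue from every level above it,
# which is what lets a listener infer their position in the menu.
print(voicemail)
```

Because each earcon is the concatenation of cues from the root downward, hearing any single earcon tells the user not just which node they are at, but how they got there.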

Little use has been made of structured non-speech sounds in TBIs. Our work has shown that the use of such sounds can increase the bandwidth of communication between the system and the user, allowing a richer interaction. These ‘multimedia’ telephone interfaces are more usable than their current equivalents. Sound has many advantages. For example, it is good for communicating information quickly. Unlike speech, non-speech sound is universal; the user is not tied to one language, which is important for the increased international use of computer systems. There is also great potential for the results of this work in other non-graphical interfaces, such as those for visually disabled people and those where working conditions or protective clothing mean that a screen cannot be used.

The project

These pages will tell you more about this general area of research (also have a look at Gregory Leplatre's web pages). There is a HyperCard stack full of earcons to download, too. We have several more sets of earcons that we can supply if you contact us.

There is an initial Java demo of how temporal cues can be used for navigation (this formed experiment 2 of our project). We then developed a detailed Java simulation of how sounds might be used for navigation in a real mobile phone (which formed experiment 3 of our project). This can be downloaded - please let us know if you use it.

To enable interface designers who are not sound experts to develop sonically-enhanced interfaces we have developed a sophisticated tool to allow the creation of hierarchical earcons. You can download this - again let us know if you use it.

In the final stages of the project we investigated soundgraphs to present complex, continuous data. We looked at presenting stock market data on a mobile telephone. The value of the stock was mapped to the pitch of a note, so that users heard a change in pitch over time. Full details can be found in the published paper. If you are interested, we can give you the Java application we used to test our ideas.
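The value-to-pitch mapping described above can be sketched as follows. This is a minimal illustration under an assumed linear mapping onto MIDI note numbers over a three-octave range; the project's actual scaling parameters are not given here.

```python
# Sketch of a soundgraph pitch mapping (assumed parameters): map each
# stock value linearly onto a MIDI note number, so rising prices are
# heard as rising pitch.

def value_to_midi(value, lo, hi, note_lo=48, note_hi=84):
    """Linearly map a value in [lo, hi] to a MIDI note in [note_lo, note_hi]."""
    frac = (value - lo) / (hi - lo)
    return round(note_lo + frac * (note_hi - note_lo))

# Hypothetical price series played back as a melody.
prices = [100.0, 102.5, 101.0, 106.0, 110.0]
lo, hi = min(prices), max(prices)
melody = [value_to_midi(p, lo, hi) for p in prices]
print(melody)  # higher prices map to higher MIDI notes
```

Normalising against the minimum and maximum of the series keeps the melody within a comfortable listening range regardless of the absolute stock price.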

Publications

The main publications in this area are shown below; the numbered items were produced as part of the project (the group publications list has the most up-to-date complete list of papers):

  • Brewster, S.A., Wright, P.C. & Edwards, A.D.N. (1993). An evaluation of earcons for use in auditory human-computer interfaces. In S. Ashlund, K. Mullet, A. Henderson, E. Hollnagel, & T. White (Eds.), Proceedings of InterCHI'93, Amsterdam: ACM Press, Addison-Wesley, pp. 222-227. Adobe PDF
  • Brewster, S.A. Raty, V.-P. & Kortekangas, A. (1996). Earcons as a Method of Providing Navigational Cues in a Menu Hierarchy. In Proceedings of HCI'96 (Imperial College, London, UK), Springer, pp 167-183. Adobe PDF
  • Brewster, S.A. (1997). Navigating telephone-based interfaces with earcons. In Proceedings of BCS HCI'97 (Bristol, UK), Springer Verlag, pp 39-56. Adobe PDF
  1. Brewster, S.A., Capriotti, A., Hall, C.V. (1998). Using compound earcons to represent hierarchies. In HCI Letters, 1(1), pp 6-8. Adobe PDF
  2. Brewster, S.A., Leplâtre, G. and Crease, M.G. Using non-speech sounds in mobile computing devices. In Proceedings of the First Workshop on Human Computer Interaction for Mobile Devices (Glasgow, UK) Department of Computing Science, University of Glasgow, 1998, pp. 26-29.
  3. Brewster, S.A. (1998). Using non-speech sounds to provide navigation cues. ACM Transactions on Computer-Human Interaction, 5(2), pp 224-259. Adobe PDF
  4. Leplatre, G., and Brewster, S. A. (1998). Perspectives on the Design of Musical Auditory Interfaces. Focus conference on Anticipation, Cognition and Music. Liege, Belgium, Aug. 1998. In Dubois D. (Ed) International Journal of Computing Anticipatory Systems, Feb. 1999. Adobe PDF
  5. Leplatre, G., Brewster, S.A. (1998). An Investigation of Using Music to Provide Navigation Cues. In Proceedings of ICAD'98 (Glasgow, UK), British Computer Society. Adobe PDF
  6. Leplatre, G. and Brewster, S.A. Improving the Design of Telephone-Based Interfaces. In Volume II of the Proceedings of INTERACT '99 (Edinburgh, UK) British Computer Society, 1999, pp. 45-46.
  7. Brewster, S.A. Using Earcons to Provide Navigation Cues in Telephone-Based Interfaces. ACM Interactions 6, 2 (1999), 9-10.
  8. Leplatre, G. and Brewster, S.A. (2000). Designing Non-Speech Sounds to Support Navigation in Mobile Phone Menus. In Proceedings of ICAD2000 (Atlanta, USA), ICAD, pp 190-199. Adobe PDF
  9. Brewster, S.A., Crossan, A. and Crease, M. (2000). Automatic volume control for auditory interfaces. In Volume II Proceedings of BCS HCI 2000 (Sunderland, UK), pp 17-18. Adobe PDF
  10. Leplatre, G. (2000). Using Non-Speech Sounds to Improve Interaction with Telephone-Based Interfaces. Volume II proceedings of HCI 2000 (Sunderland, UK), pp 142-143.
  11. Brewster, S.A. and Murray, R. (2000). Presenting dynamic information on mobile computers. Personal Technologies, 4(2), pp 209-212. Adobe PDF
  12. Leplatre, G. (2001). The design and evaluation of non-speech sounds to support navigation in restricted display devices. PhD thesis, Glasgow University, UK. To be submitted Spring 2001. Adobe PDF. Sound files associated with thesis.
  13. Helle, S., Leplâtre, G., Laine, P. and Marila, J. (2001). Menu sonification in a mobile phone - a prototype study. In proceedings of ICAD2001 (Helsinki, Finland).

See our research pages for other projects going on within the Multimodal Interaction Group.