

Principles for Improving Interaction in Telephone-Based Interfaces

EPSRC project GR/L66373

Oct/97 - Oct/2000

Experiment 2

In this early experiment we investigated new principles for representing hierarchical menus, such as those in telephone-based interfaces, with non-speech audio. The sounds were designed to test how efficiently specific features of a musical language could provide navigation cues. All the sounds were designed using a single instrument, the piano, so the only differences between them were syntactic.

A hierarchy of 25 nodes, with a sound for each node, was used. Participants (half musicians, half non-musicians) were asked to identify the position of each sound in the hierarchy. The overall recall rate of 86% suggests that syntactic features of a musical representation language can serve as meaningful navigation cues.
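The mapping idea can be sketched as follows. This is purely illustrative and not the experiment's actual sound design: the hierarchy shape (one root, four branches of five items each) and the cue mapping (depth to octave register, sibling index to note count) are assumptions chosen only to show how syntactic features of a sound can encode a node's position.

```python
# Illustrative sketch (not the experiment's design): give each node of a
# 25-node menu hierarchy a piano cue whose structure encodes its position.

def build_hierarchy():
    """Root plus 4 branches, each with 5 items: 1 + 4 + 20 = 25 nodes."""
    nodes = {"root": {"depth": 0, "index": 0}}
    for i in range(4):
        branch = f"menu{i}"
        nodes[branch] = {"depth": 1, "index": i}
        for j in range(5):
            nodes[f"{branch}.item{j}"] = {"depth": 2, "index": j}
    return nodes

def cue_for(node):
    """Hypothetical mapping: depth -> octave register, sibling index -> note count."""
    base_midi = 72 - 12 * node["depth"]       # deeper levels use lower octaves
    return [base_midi + 2 * k for k in range(node["index"] + 1)]

nodes = build_hierarchy()
assert len(nodes) == 25
print(cue_for(nodes["menu2.item3"]))  # four ascending notes in a low register
```

A listener who has learned the mapping can recover both the level and the sibling position of a node from its cue alone, which is the kind of navigation information the experiment tested.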

You can listen to the sounds used in this experiment by clicking on the nodes of the menu tree below. Bear in mind that these sounds were designed for exploratory purposes only. They were recorded at CD quality (16-bit, 44.1 kHz) and down-sampled to the Sun .au format (8-bit, 8 kHz), the only audio format Java supported when the experiment was performed.
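The down-sampling step described above can be sketched as follows. This is a minimal illustration, not the tool actually used: a 16-bit, 44.1 kHz signal is reduced to 8-bit, 8 kHz by nearest-neighbour resampling and requantisation. (The Sun .au format typically stores 8-bit mu-law samples; plain linear quantisation is used here for brevity.)

```python
# Sketch of reducing 16-bit / 44.1 kHz audio to 8-bit / 8 kHz.
import math

SRC_RATE, DST_RATE = 44100, 8000

def make_tone(freq=440.0, secs=0.1):
    """A 16-bit sine tone at the source rate (stands in for a recorded cue)."""
    n = int(SRC_RATE * secs)
    return [int(32767 * math.sin(2 * math.pi * freq * i / SRC_RATE))
            for i in range(n)]

def downsample(samples):
    """Pick the nearest source sample for each destination sample,
    then drop the low 8 bits to go from 16-bit to 8-bit range."""
    n_out = int(len(samples) * DST_RATE / SRC_RATE)
    return [samples[int(i * SRC_RATE / DST_RATE)] >> 8 for i in range(n_out)]

tone = make_tone()
small = downsample(tone)
print(len(tone), len(small))  # 4410 samples -> 800 samples
```

Nearest-neighbour decimation without a low-pass filter introduces aliasing; it is used here only to keep the sketch short.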

This work was published at ICAD'98; full details of the results can be found there.


Back to the Telephone Project home page.