Title:
Earcons and user interfaces

Abstract:
I will talk about the use of non-speech audio messages called earcons and
how they can be integrated into the human-computer interface to improve
usability. I will describe earcons and how they are constructed. I will
then talk about two experiments I conducted with colleagues at the
University of York in which earcons were used to improve interaction.

In the first example, sounds were added to a scrollbar to overcome two
usability problems: losing one's sense of position in a document and
'kangarooing' (where the scrollbar thumb can unexpectedly jump up and
down around the cursor position). Results from this experiment showed that
the addition of sound reduced mental workload, and that participants
completed the tasks more quickly and recovered from errors more quickly.

In the second example, earcons were added to screen buttons. One usability
problem with buttons is that users can slip off them by mistake. Sounds
were added to help users recover from such errors. The results showed that
participants recovered significantly more quickly, required fewer mouse
clicks to recover, and preferred the sonically-enhanced buttons.

In both of these experiments we measured the annoyance felt by
participants. It is often claimed that sounds would be too annoying to use
at the interface. Our results showed that participants did not find the
sonically-enhanced interfaces any more annoying than the standard ones.