
Positions Available in the Multimodal Interaction Group

I am always on the lookout for good Masters and PhD students to work with me in the area of human-computer interaction. I currently have computer scientists, psychologists and musicians working with me. If your background is different but you are keen on research, contact me anyway. For other jobs in the Department, check out the Department jobs page. More details on PhDs in the Department are available.

Here are some other topics of current interest:

  • Mobile/Wearable computers: I am interested in designing user interfaces for wearables. This is tricky as they normally have very small displays. My approach is to use things like 3D sound, audio (speech and music), gestures and tactile displays for the interaction. The questions are: how do you design effective interfaces using these methods, and how do you do good gesture recognition when the user is moving? Check out this paper about one of our early experiments in this area: www.dcs.gla.ac.uk/~stephen/papers/CHI2003.pdf
  • Evaluation of wearable computers: you can't just test a wearable computer in a standard usability lab; it is meant to be used on the move. We therefore need to develop new techniques that allow wearables to be evaluated while the user is walking, running or moving about in other ways. We are beginning to look at a range of measures such as the user's walking speed and ratings of comfort and acceptability. I am keen to try other things like physiological measures (heart rate, GSR, etc.) to see if we can find further ways of measuring usability on the move.
  • Force-feedback (haptic) user interfaces: we are using state-of-the-art force-feedback devices to allow people to use their sense of touch to feel virtual objects. We have used our devices for training medical students, for example, as they need to learn how to palpate patients safely. For a PhD I am keen to look at how we might use force-feedback at the desktop. Could we create haptic widgets like buttons and scrollbars to make them easier to use? What other new interaction techniques could we create if we can use our sense of touch? For more details on what we do with haptics see www.dcs.gla.ac.uk/~steven/haptics.htm
  • Interfaces for blind users: we have a long-running interest in designing interfaces for blind people and, in particular, in allowing them to access things like graphs, tables and other complex data. We do this using force-feedback and audio interfaces. There are lots of problems to be solved in displaying complex information non-graphically. For more see www.multivis.org

This is just a very quick overview; if any of these sound interesting then let me know and we can arrange to have a chat in more detail. You can check out my web pages: www.dcs.gla.ac.uk/~stephen/research.shtml, and the department's PhD pages: www.dcs.gla.ac.uk/phd


My main research interests are:

  • Multimodal human-computer interaction
  • Sound in graphical human-computer interfaces
  • 3D sound
  • Gestural interaction
  • Telephone-based interfaces and the design of interfaces for mobile computers such as PDAs, HPCs and mobile phones
  • The use of force-feedback (haptic) devices for human-computer interaction
  • Sonification/Perceptualisation of data
  • Any or all of these applied to helping people who have visual disabilities

I am really interested in how to use senses such as touch and hearing to convey information. Multimodal interfaces use more sensory modalities than vision (which tends to dominate current human-computer interaction) to improve usability. These other senses are currently under-utilised in human-computer interaction. To find out more about the research work we are currently doing, go to the Research pages.

One of my long-standing interests is in integrating sound into standard graphical human computer interactions. There are lots of papers about this on my Publications List if you want to know more.

I am interested in the use of non-speech sounds (such as earcons) to improve the usability of telephone-based interfaces (for example, telephone banking and telephone information systems). These are becoming increasingly common as more people get mobile phones and call up remote information services. Currently these systems are very hard to use, and I am working on ways to improve them.
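To give a flavour of the earcon idea, here is a minimal sketch (plain Python; the names `BASE_MOTIF`, `transpose` and the example menu options are hypothetical, not from our actual systems). An earcon is a short, structured non-speech motif; members of an earcon "family" can share a rhythm while differing in pitch, so each telephone-menu option gets a recognisably related sound:

```python
# An earcon modelled as a list of (frequency_hz, duration_s) notes.
# Family members keep the rhythm of the base motif but are transposed.

BASE_MOTIF = [(440.0, 0.15), (440.0, 0.15), (660.0, 0.30)]  # the family's rhythm

def transpose(motif, semitones):
    """Shift every note's pitch by some semitones, keeping the rhythm intact."""
    factor = 2 ** (semitones / 12)
    return [(freq * factor, dur) for freq, dur in motif]

# One earcon per (hypothetical) telephone-menu option: same rhythm, new pitch level.
menu_earcons = {
    "balance":  transpose(BASE_MOTIF, 0),
    "transfer": transpose(BASE_MOTIF, 4),
    "help":     transpose(BASE_MOTIF, 7),
}
```

The rhythmic structure is what makes the sounds a family; a real system would render these notes with a distinctive timbre as well.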

I am also interested in interfaces for mobile computers, personal digital assistants (PDAs) and hand-held PCs (HPCs), as they have very small screens and thus a narrow bandwidth of communication between the user and the device. There are lots of challenging issues to be investigated before these devices can be more than glorified diaries, and using sound is one way of overcoming the problems. We are also looking at 3D sound (sounds that appear to come from anywhere around you in 3D space, just as they do in the real world) to present information: you carry an 'audio space' with you and can hear multiple sources of sound. How to design this so that people don't get overloaded and can use the information is a key problem that needs investigation. Gestural input is being used to control the audio space surrounding the user.
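As a rough illustration of placing a sound around the listener, the sketch below shows constant-power stereo panning (a standard textbook technique, not our actual spatialisation system; the function name `pan_gains` is hypothetical). It maps a source's azimuth to left/right channel gains so the source's overall loudness stays constant as it moves:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo pan.

    azimuth_deg: -90 (hard left) .. 0 (centre) .. +90 (hard right).
    Returns (left_gain, right_gain) with left**2 + right**2 == 1,
    so perceived power is constant wherever the source sits.
    """
    theta = (azimuth_deg + 90) / 180 * (math.pi / 2)  # map to 0..pi/2
    return math.cos(theta), math.sin(theta)
```

Full 3D positioning would use head-related transfer functions rather than simple panning, but the principle of trading energy between channels is the same.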

Force-feedback is a new area which is just opening up. Such devices (initially developed for VR systems) allow users to feel virtual objects as if they were really there. There has been little research into how such devices could be used to improve standard human-computer interfaces and I am very keen to address this. We have some of the best equipment in the UK to investigate this area and are collaborating with many groups to develop the use of this state-of-the-art technology.

Sonification and perceptualisation are the equivalents of the above applied to visualisation. When visualising complex data, users can become overloaded with visual information. Why not use sound and touch to provide feedback to other senses and so take the load off the eyes? And if you are blind or cannot see a screen, how can you get at complex data? We are investigating how this can be done, and much more work is needed.
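One very simple form of sonification maps data values onto pitch, turning a data series into an 'audio graph' that can be heard rather than seen. The sketch below is a minimal illustration in plain Python (the `sonify` helper and its pitch range are hypothetical, not from our systems):

```python
def sonify(values, low_hz=220.0, high_hz=880.0):
    """Map each data value linearly onto a pitch range.

    The smallest value becomes low_hz, the largest high_hz; playing the
    resulting frequencies in order renders the data series as a melody.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]
```

Pitch is only one possible mapping; loudness, timbre or tempo can carry further data dimensions, and choosing mappings that listeners can actually interpret is exactly the kind of open question mentioned above.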

All of the ideas above can be used to help sighted people use their computers more effectively. However, I am also interested in applying them to the problems faced by blind and partially-sighted people using computers. Using sound and touch can radically improve usability for these users but there are still many research questions to be answered.

I am keen to take on psychologists, computer scientists and musicians with an interest in, and enthusiasm for, any of the above areas.


If you are interested in these areas then email me and we can talk about possibilities. For more information about doing a PhD in Glasgow, look at the PhD admissions web pages. We also have a Masters in Information Technology for those who want to gain more computing experience.

Application form and further details available from Stephen Brewster
