Videos of Inference, Dynamics and Interaction group work at Glasgow University

Touching the micron

  • S. Lamont, R. Bowman, J. Williamson, M. Rath, R. Murray-Smith, M. Padgett, Touching the Micron: Tactile Interactions with an Optical Tweezer, MobileHCI 2012 pdf video

Rewarding the original

  • Williamson, J., and Murray-Smith, R., Rewarding the original: explorations in joint user-sensor motion spaces. In: ACM Annual Conference on Human Factors in Computing Systems, 5-10 May 2012, Austin, Texas. pdf video

AnglePose

  • S Rogers, J. Williamson, C. Stewart, R. Murray-Smith, AnglePose: robust, precise capacitive touch tracking via 3D orientation estimation, ACM SIG CHI 2011. pdf Video

1. Brain-Computer Interaction

video (38Mb) mp4 (17Mb)
  • B. Blankertz, G. Dornhege, M. Krauledat, M. Schröder, J. Williamson, R. Murray-Smith, K.-R. Müller, The Berlin Brain-Computer Interface presents the novel mental typewriter Hex-o-Spell, 3rd International BCI Workshop and Training Course, Graz, 2006. pdf
  • B. Blankertz, M. Krauledat, G. Dornhege, J. Williamson, R. Murray-Smith, and K.-R. Müller, A Note on Brain Actuated Spelling with the Berlin Brain-Computer Interface, HCI International, China, 2007. pdf

2. Stane tactile input and bearing-based Mobile Spatial Interaction video.


3. Using the BodySpace approach, the user can answer the phone simply by bringing it to the listening position. This video shows examples while stationary, while walking, and where the user first waves the phone around: the phone does not respond to arbitrary movement, only to movement compatible with being brought to a listening position. mp4. Here is another BodySpace video, where a music player is controlled by using body locations to determine content or function. In this clip, songs are stored around the right shoulder and can be browsed and selected by hand movements alone; the volume control is located near the hip, and track back/forward is at the ear. The BodySpace webpages give more background.

S. Strachan, R. Murray-Smith, S. O’Modhrain, BodySpace: inferring body pose for natural control of a music player, Extended abstracts of ACM SIG CHI Conference, San Jose, 2007. pdf video (mp4)
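The gating described above (responding only to movements compatible with the listening gesture, not to arbitrary motion) can be sketched as template matching on accelerometer traces. This is an illustrative simplification with hypothetical names and threshold, not the published BodySpace implementation, which is based on dynamic primitives and probabilistic inference:

```python
import math

def matches_listening_gesture(trace, template, threshold=0.5):
    # trace/template: equal-length lists of (ax, ay, az) samples
    # (resampling to a common length is assumed to happen upstream).
    # Accept only if the mean per-sample distance to the stored
    # 'raise to ear' template is below the threshold.
    dist = sum(math.dist(a, b) for a, b in zip(trace, template)) / len(template)
    return dist < threshold
```

A trace that resembles the template is accepted; random waving produces a large distance and is rejected.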

4. Shoogle is an interface for actively feeling the contents of your phone. To check for new SMS messages or e-mails, just give the phone a shake: if there are new messages, it feels as if balls are bouncing around inside it. The impact sounds tell you who sent each message and what sort of message it is. This is a general technique for coupling inference mechanisms with multimodal interaction. mp4-video
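The ball-in-a-box metaphor can be sketched as a tiny physics step driven by accelerometer input. This is an illustrative 1-D simplification with hypothetical field names, not the Shoogle implementation:

```python
def shoogle_step(balls, device_accel, dt=0.02, box=1.0, restitution=0.6):
    # One physics step: each unread message is a ball in a 1-D box.
    # Shaking the device (device_accel) throws the balls about; wall
    # impacts are the events that would trigger sound and vibration.
    impacts = []
    for ball in balls:
        ball["v"] -= device_accel * dt   # in the device frame, balls lag behind the shake
        ball["x"] += ball["v"] * dt
        if ball["x"] < 0.0 or ball["x"] > box:            # wall collision
            ball["x"] = min(max(ball["x"], 0.0), box)
            impacts.append((ball["sender"], abs(ball["v"])))  # loudness ~ impact speed
            ball["v"] = -restitution * ball["v"]
    return impacts
```

Each returned impact would be rendered as a sound whose timbre identifies the sender and message type.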

5. Body Space, MESH and MoodPlayer demo. Quicktime Video (16Mb) Quicktime Video (Streaming). This video shows Syntonetic's Moodplayer linked to a Pocket PC/MESH system via Bluetooth.

See this paper for more details about the Body Space concepts:

  • S. Strachan, R. Murray-Smith, I. Oakley, J. Ängeslevä, Dynamic Primitives for Gestural Interaction, Mobile Human-Computer Interaction – MobileHCI 2004: 6th International Symposium, Glasgow, UK, September 13-16, 2004. Proceedings. Stephen Brewster, Mark Dunlop (Eds), LNCS 3160, Springer-Verlag, p325-330, 2004. pdf SpringerLink
  • S. Strachan, R. Murray-Smith, S. O’Modhrain, BodySpace: inferring body pose for natural control of a music player, Extended abstracts of ACM SIG CHI Conference, San Jose, 2007. pdf video

6. Tremor control of a PocketPC (S. Strachan, R. Murray-Smith, Muscle Tremor as an Input Mechanism, UIST 2004, Santa Fe, 2004. pdf)

7. Multimodal Speed Dependent Automatic Zooming. Version with stylus interaction

  • P. Eslambolchilar, J. Williamson, R. Murray-Smith, Multimodal Feedback for tilt controlled Speed Dependent Automatic Zooming, UIST 2004, Santa Fe, 2004. pdf
  • P. Eslambolchilar, R. Murray-Smith, Tilt-based Automatic Zooming and Scaling in Mobile Devices - a state-space implementation, Mobile HCI, 2004.
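Speed-dependent automatic zooming couples the zoom level to scrolling speed: the faster the user scrolls, the further the view zooms out, keeping visual flow manageable. A minimal sketch of that coupling, with an illustrative gain and zoom limits rather than the papers' state-space formulation:

```python
def sdaz_zoom(speed, k=0.02, min_zoom=0.25, max_zoom=1.0):
    # Faster scrolling -> lower magnification (zoomed out),
    # clamped between min_zoom and max_zoom.
    return max(min_zoom, max_zoom - k * abs(speed))
```

At rest the view is at full magnification; rapid tilting or scrolling pulls it smoothly out to the overview level.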

8. Tilt-interaction with a mobile phone emulator - version 1

9. Tilt-interaction with a mobile phone emulator - version 2

10. Haptic granular synthesis (A. Crossan, J. Williamson, R. Murray-Smith, Haptic Granular Synthesis: Targeting, Visualisation and Texturing, International Symposium on Non-visual & Multimodal Visualization, London, IEEE Computer Society, 2004 pdf)

11. Text entry video (11Mb) (J. Williamson, R. Murray-Smith, Dynamics and probabilistic text entry, DCS Technical Report TR-2003-147, Department of Computing Science, Glasgow University, June, 2003. pdf )

12. Haptic dancing (S. Gentry, R. Murray-Smith, Haptic dancing: human performance at haptic decoding with a vocabulary, IEEE International conference on Systems Man and Cybernetics, Washington, D.C., USA, 2003 pdf )

13. Two-player pong, via Bluetooth with accelerometer input. An experimental platform for exploring display-free games via multimodal feedback.

14. Xsens P3C Accelerometer with Bluetooth link, and haptic feedback for control of aircraft in X-plane simulator:

15. MP3 file selection via tap or accelerometer input. An implementation of the pointing without a pointer approach to selection, where the display modality is audio, and the input is tapping or shaking, depending on mode. The correlation between the rhythm of the track and the rhythm of the tapping is used to select the song. The ambiguity of the user's tapping is visible in the width of the red band at the top of the display.
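The selection mechanism described above can be sketched as picking the track whose beat pattern correlates best with the user's tap onsets. The signal representation and function names here are illustrative assumptions, not the demo's code:

```python
def correlation(a, b):
    # Pearson correlation between two equal-length signals.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / ((va * vb) ** 0.5)

def select_track(tap_signal, track_beats):
    # Return the index of the track whose beat pattern best
    # matches the rhythm the user is tapping.
    scores = [correlation(tap_signal, beats) for beats in track_beats]
    return max(range(len(scores)), key=scores.__getitem__)
```

The spread of correlation scores across tracks corresponds to the ambiguity shown in the red band of the demo's display.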

16. Navigating a campus with audio and vibration feedback, based on uncertain location and orientation sensing.

  • J. Williamson, S. Strachan, R. Murray-Smith, It’s a Long Way to Monte-Carlo: Probabilistic GPS Navigation, Proceedings of Mobile HCI 2006, Helsinki, 2006. pdf video
  • S. Strachan, J. Williamson, R. Murray-Smith, Show me the way to Monte Carlo: density-based trajectory navigation, Proceedings of ACM SIG CHI Conference, San Jose, 2007.
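The probabilistic navigation in these papers maintains a Monte Carlo (particle) representation of uncertain position. A minimal single prediction/weighting/resampling step, with illustrative noise parameters and a simple Gaussian GPS likelihood, is sketched below; the papers' actual models are richer:

```python
import math
import random

def particle_filter_step(particles, control, gps_obs, gps_sigma=5.0):
    # Predict: move each (x, y) particle by the control input,
    # plus Gaussian motion noise (assumed).
    moved = [(x + control[0] + random.gauss(0.0, 1.0),
              y + control[1] + random.gauss(0.0, 1.0)) for x, y in particles]
    # Weight: likelihood of the noisy GPS fix under each particle.
    weights = [math.exp(-((x - gps_obs[0]) ** 2 + (y - gps_obs[1]) ** 2)
                        / (2.0 * gps_sigma ** 2)) for x, y in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportional to weight.
    return random.choices(moved, weights=weights, k=len(moved))
```

The particle cloud's spread is what drives the audio and vibration feedback about location uncertainty.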

17. Active selection with the 'eggheads' metaphor. A number of 'heads' experience orientation disturbances. Input motion is applied to all heads equally. By cancelling the disturbance, selection is achieved. The demo can be downloaded.
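The cancellation principle above (shared input, per-head disturbance, selection of the head whose disturbance the user cancels) can be sketched as choosing the head with the smallest residual motion energy. Names and signal representation are illustrative, not the demo's implementation:

```python
def select_by_cancellation(disturbances, user_input):
    # Each head sees its own disturbance sequence plus the same shared
    # user input; the head whose combined motion stays closest to zero
    # (i.e. whose disturbance the user has cancelled) is selected.
    def residual_energy(disturbance):
        return sum((d + u) ** 2 for d, u in zip(disturbance, user_input))
    energies = [residual_energy(d) for d in disturbances]
    return min(range(len(energies)), key=energies.__getitem__)
```

The Brownian-motion variant in the next demo works the same way, but with correlation of smooth 2-D trajectories rather than cancellation of orientation disturbances.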

18. Active selection video for Brownian motion targets. Each individual target moves on a smooth, independent course. The user's mouse actions are applied equally to the trailing targets. Correlating motion results in selection. The demo can be downloaded.


19. Tilt-based photo browsing on a phone.

From: S. J. Cho, R. Murray-Smith, C. Choi, Y. Sung, K. Lee, Y-B. Kim, Dynamics of Tilt-based Browsing on Mobile Devices, Extended abstracts of ACM SIG CHI Conference, San Jose, 2007. pdf video mp4

20. Steve Strachan's "Star Wars Light Sabre" demo, running on the Nokia 5500. Latency is an issue and the system is very basic, but it is still a fun demo. The app was programmed using Python for S60.