Integrating Field and Systemic Data in a Visualisation for Collaboration  

Developers: Alistair Morrison and Paul Tennent

Download: Java + example data set + intro video (beta, via Alistair Morrison’s web site)

Documentation: PDF and the papers below

Funding: Equator, an EPSRC-funded interdisciplinary research collaboration, and ESRC e-Social Science small grant.

The overall aim of Replayer is to help those evaluating and designing for ubiquitous computing in everyday life, e.g. mobile phones used in leisure, entertainment and business. We aimed to address some of the difficulties of such evaluation by letting evaluators combine, analyse and visualise the many types of record that they collect and create. This includes tools for handling one or more streams of audio or video, logs of events recorded by computer programs, spatial data on what happened when and where, and statistical tools such as histograms and mutual information measures. Also, because the system is built from several components that can be spread across the computers of programmers and sociologists engaged in such research, they can work together as they analyse this data.

Replayer began in the PhD work of Glasgow student Paul Tennent, supervised by Matthew Chalmers and set within the Equator interdisciplinary research collaboration. In the spring of 2006, Dr. Alistair Morrison started work on improving and extending Replayer. New features include finding video recordings of particular people within a large amount of data. While some researchers have tried to analyse video automatically, to spot who is in each video frame, we focused on using other data, such as records of who was where, and when. By tracking the locations not only of the participants being observed but also of the observers and their video cameras, we can estimate which participant was recorded when. We added simple controls in Replayer to enter the names of one or more people to search for; Replayer responds by showing a timeline of all the video in which those people can be seen. One can then press ‘play’ to watch it all, jump to particular segments, or look at other associated data in order to narrow down further on the particular video segments one wants to examine.
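The core of this estimation is a geometric test: does a participant's logged position fall within a camera's field of view at a given moment? A minimal sketch of such a test is below, assuming 2-D positions in metres and bearings in degrees clockwise from north; the function name `in_view` and its parameters are illustrative, not Replayer's actual API.

```python
import math

def in_view(cam, bearing_deg, fov_deg, max_range, target):
    """Return True if `target` lies inside the camera's field of view.

    cam, target: (x, y) positions in metres.
    bearing_deg: direction the camera faces, clockwise from north.
    fov_deg: the camera's angle of view.
    max_range: beyond this distance, people are assumed unrecognisable.
    """
    dx, dy = target[0] - cam[0], target[1] - cam[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_range:
        return False
    # Bearing from camera to target, clockwise from north.
    to_target = math.degrees(math.atan2(dx, dy)) % 360
    # Smallest angular difference between camera bearing and target bearing.
    diff = abs((to_target - bearing_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2
```

Running this test per logged timestamp, and comparing the matching timestamps against each video's start time, yields the segments in which a given participant was plausibly on camera.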

Apart from our own work, University of Nottingham researchers in the DReSS project have experimented with the system to analyse a ubicomp experience. Also at Glasgow, Phil Gray’s group has used Replayer to analyse archaeological survey data and, as part of the EU OpenInterface project, data from sensors that track fine-grained body movements.

We are making a public beta version of Replayer available here as open source, so that others can get it and use it in their research. If you use it, please email us to let us know, as we’d like to know a little about how you use it and what you use it for. Bug reports and suggestions are welcome but we may not be able to act on them until later this year.


Using Location, Bearing and Motion Data to Filter Video and System Logs

Alistair Morrison, Paul Tennent, Matthew Chalmers and John Williamson

Proc. Pervasive 2007, Toronto, 109-126

Coordinated Visualisation of Video and System Log Data

Alistair Morrison, Paul Tennent, Matthew Chalmers
Proc. 4th Intl. Conf. on Coordinated & Multiple Views in Exploratory Visualization (CMV) 2006, 91-102

Replayer: Collaborative evaluation of mobile applications

Paul Tennent, Alistair Morrison, Matthew Chalmers

Proc. Workshop on Information Visualization and Interaction Techniques for Collaboration Across Multiple Displays, ACM CHI 2006, Montreal

Auto-classifying Salient Content In Video

Alistair Morrison, Paul Tennent, John Williamson, Matthew Chalmers

Proc. Workshop on Computer Assisted Recording, Pre-Processing, and Analysis of User Interaction Data, BCS HCI 2006, London.

Supporting Ethnographic Studies of Ubiquitous Computing in the Wild

Andy Crabtree, Steve Benford, Chris Greenhalgh, Paul Tennent, Matthew Chalmers, Barry Brown
Proc. ACM Designing Interactive Systems (DIS) 2006, 60-69.

Recording and Understanding Mobile People and Mobile Technology

Paul Tennent and Matthew Chalmers

Proc. 1st Intl. Conf. on e-Social Science, Manchester, 2005


Java executable and source are available as a beta in an 837MB zipped package that includes the Java source, an example data set and an overview video. To be honest, we know this software is quite hard to get into... but it offers a good deal once you do. We also plan further work on improving interaction, and Alistair Morrison continues to do just that.


An overview video is available here. It’s the same one downloadable in the package mentioned above, along with the source and executable.

Also, the links below are for videos showing an earlier version of Replayer:


Trails formed from the logged locations of two participants in a ubicomp experiment, overlaid by Replayer on an aerial photograph in Google Earth (left). In the right image, triangles show the visual fields of static evaluators’ cameras, and participants’ locations potentially captured by the cameras are highlighted in red. By comparing the log data for those locations with video timestamps, Replayer automatically retrieves video segments showing chosen events or participants, easing a common but arduous task for those analysing mobile technology in use.

Two video streams playing in synchronisation, with a timeline for each shown underneath. The analyst has selected one trial participant, and green highlighting on the timelines shows the periods in each video where that participant appears, based on the techniques described in the figure above. Video playback can be set to skip periods where the participant appears in neither video stream.

An event series (left) shows sensor data from an accelerometer carried by a trial participant. It shows steady walking, then a short period of running with higher-amplitude movements, and then a period of intermediate activity (selected and highlighted in blue). Selecting this period cues the video (right), which reveals that the associated activity was walking down stairs.
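Distinguishing walking from running in such data can be done by windowed signal amplitude alone. The sketch below labels fixed-size windows of accelerometer magnitudes by their RMS value; the thresholds and labels are purely illustrative, not the values Replayer uses:

```python
import math

def activity_levels(samples, window, thresholds=(1.5, 3.0)):
    """Label each fixed-size window of accelerometer magnitude samples
    by RMS amplitude: low -> 'walking', high -> 'running', else
    'intermediate'. Thresholds are illustrative placeholders."""
    labels = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        rms = math.sqrt(sum(x * x for x in w) / window)
        if rms < thresholds[0]:
            labels.append("walking")
        elif rms < thresholds[1]:
            labels.append("intermediate")
        else:
            labels.append("running")
    return labels
```

An analyst could then select any run of "intermediate" windows and, as in the figure, cue the synchronised video to see what the participant was actually doing.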

Replayer can use Google Earth to display spatial aspects of data. Here, logs of system events from a mobile multiplayer game (Treasure) are filtered to show only those that occurred within wi-fi coverage. The resulting arc shows the distribution of wi-fi across the urban area used for game play.

An overview of four Replayer tools, showing data from the Treasure mobile multiplayer game. Clockwise from top left: a video component with two video streams, a playback control and a timeline of available video; a chart of logged system events; a log display and histogram; and Google Earth used for spatial data display.

A spatial display from an older Replayer version, showing the locations of logged system events in the Treasure mobile multiplayer game, with red and yellow shading showing wi-fi coverage: a key infrastructure resource used as a game element.